Lab 2 - Training a Data Driven Model for Stellar Label Classification

Abstract

Ever-increasing sizes of stellar spectroscopy surveys put a strain on conventional label-predicting pipelines, and more data-driven approaches are necessary. In this lab, we develop a data-driven generative model for stellar label prediction. Chosen, low-noise spectra from the APOGEE survey are combined with trusted labels from the ASPCAP pipeline, to serve as a training and testing set. Using this data, we initially develop a model to predict stellar spectrum from five labels, Teff, Log(g), Fe/H, Mg/Fe, and Si/Fe. This model is then reversed to a label-from-spectrum predicting model, using maximum likelihood methods. As an alternative to this approach, we train a label-from-spectrum predicting neural network, trained and tested on the same datasets. We find that the first predictive model has good predictive capabilities, with values deviating from ASPCAP by RMSE 74.32, 0.1543, 0.0537, 0.0516, 0.0558, for Teff, Log(g), Fe/H, Mg/Fe, and Si/Fe, respectively. The simplistic neural network fall slightly behind, with values ~30% higher. This gap is almost entirely closed in a second neural network implementation, using data augmentation and more epochs.

1. Introduction

1.1 APOGEE

The APOGEE survey contains vast amounts of high resolution (R~22'500) spectroscopy data from 146'000 stars in the Milky Way. The survey is done in a short infrared (1.51-1.70 µm) band, giving ~8000 wavelength pixels over three almost-overlapping bands.

1.2 Label Prediction - ASPCAP and data-driven approaches

The APOGEE data comes with labels derived using the ASPCAP pipeline, which uses known physical properties of the stars (with some substantial approximations) to derive their labels from the spectrums. This approach is extremely time-consuming, and alternate, more data-driven approaches, like the Cannon (presented in Ness et al.) have been suggested. Such models hold certain advantages. They, first of all, tend to be much cheaper than physics-driven models. While training the model can be expensive, new predictions are usually very cheap. With large and good enough training sets, they could also learn features that might be lost in the approximations of a physical model.

We will in this lab create such a data driven model. The underlying idea is simply that there should be a deterministic relationship between some complete set of stellar labels, and the stellar spectrum. This relationship should be learnable through pure data-comparison between accurate labels and spectrum. The cannon employs three such labels, Teff, Log(g), and Fe/G, and we will employ two additional labels, Mg/Fe, and Si/Fe. The set of labels will in practice never be fully exhaustive, but the addition of more labels could increase our models' completeness.

2. Data Gathering

2.1 APOGEE Data Fetch

APOGEE provides their spectrum data in a series of ".fits" files, accessible through an archive on their webpage. The files are sorted in directories after their location ID and named after their object ID. Since we're only looking at 4 location IDs, we do a separate request of all files in these 4 directories using wget. The request is done using the following wget commands, to ensure only the desired files are requested, and that nothing else funky happens:

  • -q --quiet (prevents wget prints.)
  • -r --recursive (makes wget search through all hyperlinks in the provided directory, which contains our desired files.)
  • -np --no-parent (prevents wget from recursively searching parent directories, so we don't download the entire internet.)
  • -nd --no-directories (prevents wget from downloading directories.)
  • -e robots=off (executes the command "robots=off", which ensures wget doesn't search the robot.txt anti-parse file included in the webpage.)
  • -P --directory-prefix (where we want to save our files.)
  • -A --accept (only include files following this pattern.)
  • -R --reject (exclude files following this pattern. There is a bunch of similar-looking files in the directory. Don't know what they are, but we don't want them.)

while these "apStar" files contain the APOGEE spectrum data, the ASPCAP derived labels are all held in one "allStar" file. We also include this in our fetch.

Below we execute this pulling down ~2700 files, for a combined of ~8GB.

If you already have local access to these files, please create a directory stars/ in the same folder as this notebook is run. In this, put the allStar file, as well as 4 directories by the name of the 4 location IDs (stars/4230/, stars/4262/, ...), which should each contain their respective apStar files.

In [0]:
# ! mkdir -p stars/4230/
# ! mkdir -p stars/4262/
# ! mkdir -p stars/5162/
# ! mkdir -p stars/4241/

# ! wget -r -q -np -nd -e robots=off -P stars/4230/ -A 'apStar-r8-*.fits' -R 'apStar-r8-*-*.fits' https://data.sdss.org/sas/dr14/apogee/spectro/redux/r8/stars/apo25m/4230/
# ! wget -r -q -np -nd -e robots=off -P stars/4262/ -A 'apStar-r8-*.fits' -R 'apStar-r8-*-*.fits' https://data.sdss.org/sas/dr14/apogee/spectro/redux/r8/stars/apo25m/4262/
# ! wget -r -q -np -nd -e robots=off -P stars/5162/ -A 'apStar-r8-*.fits' -R 'apStar-r8-*-*.fits' https://data.sdss.org/sas/dr14/apogee/spectro/redux/r8/stars/apo25m/5162/
# ! wget -r -q -np -nd -e robots=off -P stars/4241/ -A 'apStar-r8-*.fits' -R 'apStar-r8-*-*.fits' https://data.sdss.org/sas/dr14/apogee/spectro/redux/r8/stars/apo25m/4241/
# ! wget -q -P stars/ https://data.sdss.org/sas/dr14/apogee/spectro/redux/r8/stars/l31c/l31c.2/allStar-l31c.2.fits
In [2]:
# Google Colab prerequisits
# ! pip install corner
# ! pip install emcee
# from google.colab import files
# uploaded = files.upload()
# from google.colab import drive
# drive.mount('/content/gdrive')
# ! cp -r '/content/gdrive/My Drive/Colab Notebooks/stars' '.'
Requirement already satisfied: corner in /usr/local/lib/python3.6/dist-packages (2.0.1)
Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from corner) (1.14.6)
Requirement already satisfied: matplotlib in /usr/local/lib/python3.6/dist-packages (from corner) (3.0.3)
Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->corner) (1.0.1)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->corner) (2.3.1)
Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->corner) (2.5.3)
Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib->corner) (0.10.0)
Requirement already satisfied: setuptools in /usr/local/lib/python3.6/dist-packages (from kiwisolver>=1.0.1->matplotlib->corner) (40.8.0)
Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.6/dist-packages (from python-dateutil>=2.1->matplotlib->corner) (1.11.0)
Requirement already satisfied: emcee in /usr/local/lib/python3.6/dist-packages (2.2.1)
Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from emcee) (1.14.6)

2.2 Practical Considerations

We're gonna be storing all data in numpy arrays. ID data gets their own arrays, while labels are stored together in a 2D array. Flux and flux error data are stored in a 2-dimensional array. All data is matched by index on the first axis, corresponding to respective stars.

Below, we parse through at all the filenames we fetched with wget to extract object IDs and location IDs from these filenames to our arrays. We then parse through all the files and extract their flux and flux error data, as well as an error bitmask for later use.

2.3 Combining the spectra

Now, there are several flux (and error) spectra for each star in the .fits files, corresponding to different visits. Each visit is simply a spectrum observation done over some continuous exposure time. Multiple visits are usually performed for each star, at different times. These visits will need to be Doppler shifted to some reference velocity, since they might have been done at different points in the Earth's orbit, and therefore have different relative velocities, causing different red/blue-shifts.

Luckily for us, the spectrum data for each visit has both been Doppler shifted, and neatly combined into two combined spectra. The visits are combined into two combined spectra, using different weighting. The first is pixel-based, and the second uses a more combined, global weighing. We will be using the latter.

In [0]:
import sys
import numpy as np
import matplotlib.pyplot as plt
import matplotlib as mpl
import pandas as pd
import corner
import scipy.interpolate
import astropy
import seaborn as sns
# import mpld3
#mpld3.enable_notebook()
from tqdm import tqdm
from astropy.io import fits
from os import listdir
from os.path import isfile, join
import warnings

warnings.filterwarnings('ignore')
%matplotlib inline
plt.style.use('seaborn-darkgrid')
mpl.rcParams['figure.figsize'] = [10.0, 4.0]
mpl.rcParams['figure.dpi'] = 80
mpl.rcParams['savefig.dpi'] = 100
mpl.rcParams['font.size'] = 9
mpl.rcParams['legend.fontsize'] = 'Large'
mpl.rcParams['figure.titlesize'] = 'Small'

np.random.seed(1942)
In [4]:
obj_ids = []
loc_ids = []
target_ids = []
for loc in ['4230', '4262', '5162', '4241']:
    filenames = listdir('stars/' + loc)
    for name in filenames:
        if isfile('stars/' + loc + '/' + name):
            name = name.split('r8-')[1]
            name = name.split('.fits')[0]
            obj_ids.append(name)
            loc_ids.append(loc)
            target_ids.append(loc + '.' + name)

nr_stars = len(obj_ids); print(f"Fetched data for {nr_stars} stars.")
Fetched data for 2723 stars.
In [5]:
nr_pixels = 8575  # 8575 seems to be the number of flux-pixels in each dataset.
spec = np.zeros((nr_stars, nr_pixels), dtype=np.float32)  # APOGEE gives data in float32
spec_error = np.zeros((nr_stars, nr_pixels), dtype=np.float32)
bitmask = np.zeros((nr_stars, nr_pixels), dtype=int)

for i in tqdm(range(nr_stars)):
    filename = 'stars/' + loc_ids[i] + '/apStar-r8-' + obj_ids[i] + '.fits'
    with fits.open(filename) as infile:
        spec[i] = (infile[1].data)[1]
        spec_error[i] = (infile[2].data)[1]
        bitmask[i] = (infile[3].data)[1]

with fits.open(filename) as infile:
    CRVAL1 = infile[0].header['CRVAL1']
    CDELT1 = infile[0].header['CDELT1']
wl = np.logspace(CRVAL1, CRVAL1 + (nr_pixels-1)*CDELT1, nr_pixels)
100%|██████████| 2723/2723 [00:38<00:00, 70.00it/s]

2.4 First Look

We now have data for spectrum and error data, as well as object and location IDs, all matched by index.

spec and spec_error is (2723)x(8575) arrays, as there are initially 2723 stars, each with flux observations at 8575 different wavelengths. Below we have plotted the spectrum and error spectrum of 3 chosen stars. The entire spectrums are plotted to the left, and a zoomed in, more readable version is on the right. Since the errors are very small, and the data very noisy, plotting error bars on the data is inconvenient.

A few things are immediately apparent. There are pixels with no flux-data in all 3 spectrums. These are the regions outside, or in between, the 3 bands in the APOGEE survey. The wavelengths are still included, but the flux is set to 0, and they are marked with errors of 1e10 (Or, most of them are. We explore this below). On the more readable spectrums on the right, we see that the errors tend to have a convex form, with most lying on a rather well-defined curve, with some outliers. The data also has some overall trends, decaying or increasing towards higher wavelengths. If we consider stars perfect blackbodies, the curves should obey Planck's law, and the general trend is a result of being on the left or right of the peak in the blackbody radiation spectrum. Errors and instrumental considerations also come into play. We will explore these general trends later when we attempt to normalize the flux across all stars.

In [6]:
for i in range(3):
    j = np.random.randint(0, nr_stars)
    spectrum_1 = spec[j]
    spec_error_1 = spec_error[j]
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(16,3))
    ax1.plot(wl, spectrum_1, lw=1, c="navy")
    ax1.set_title(f"Spectrum of star {target_ids[j]}")
    ax11 = ax1.twinx()
    ax11.scatter(wl, spec_error_1, s=0.4, c="crimson");

    ax2.plot(wl, spectrum_1, lw=0.5, c="navy")
    ax2.set_title(f"Spectrum of star {target_ids[j]}")
    ax2.set_ylim(0.5*np.median(spectrum_1), 1.8*np.median(spectrum_1))
    ax22 = ax2.twinx()
    ax22.scatter(wl, spec_error_1, s=0.4, c="crimson");
    ax22.set_ylim(0.5*np.median(spec_error_1), 2*np.median(spec_error_1))

2.5 Finding consistent band sizes

We don't really care for the areas outside the bands and establishing where the bands end is important. This becomes especially important when we observe that the bands end at different wavelengths for different spectra. This is probably because our data has been doppler shifted to a common velocity frame for all visits over all spectra. This results in a shift in the spectrum, in different directions for each visit, which leads to the bands not always corresponding.

Although APOGEE has tagged most out-of-band areas with high error, it tends to stop before the band actually begins. To deal with this mess, we will employ two corrections to the data:

  • We will set the error of all very low flux areas to an extremely high value, expanding on what APOGEE has done. We have set this cutoff limit to all fluxes below 5% of the median flux.
  • We will define a conservative limit on where the bands actually lie and ignore all values falling outside this range for in all our models. For this range, we will employ the same range used in APOGEEs pseudo-continuum normalization, which consists of 7214 of the 8575 original pixels.

The latter correction will mean cutting off some actually good data in some spectrums, but there is still plenty to work with, and this improves consistency throughout our spectrums. Some hindsight reveals that cutting slightly into each band improves the final model.

In [0]:
spec_error[spec < 0.05*np.median(spec, axis=0)] = 1e10

band_idxs = np.array([[322, 3241], [3648, 6047], [6412, 8305]])
band_cut = np.zeros(nr_pixels, dtype=bool)
band_cut[band_idxs[0,0]:band_idxs[0,1]+1] = True
band_cut[band_idxs[1,0]:band_idxs[1,1]+1] = True
band_cut[band_idxs[2,0]:band_idxs[2,1]+1] = True

band_idxs_reduced = np.array([[0, 2919], [2920, 5319], [5320, 7213]])

2.6 Label Extraction

The AllStars file provided by APOGEE contains labels derived by ASPCAP. The labels of interest are:

  • Teff: Effective Temperature.
  • Log(g): Natural logarithm of surface gravity.
  • Fe/H: Iron to Hydrogen ratio.
  • Mg/Fe: Magnesium to Iron ratio.
  • Si/Fe: Silisium to Iron ratio.

The labels are stored by location and object IDs, which we have, so we simply parse the file and extract the following labels for each star. In addition, the file contains the SNR(Signal to Noise Ratio) information for each star.

Some stars show up more than once (identical target_ids) in the ASPCAP file. This might be due to multiple observations, that for some reason aren't counted as multiple visits. Either way, for simplicity we will choose whatever file shows the lower SNR, as that is a naive indication of better data.

In [8]:
label_names = ["Teff", "Log(g)", "Fe/H", "Mg/Fe", "Si/Fe"]
label_xlims = [[3200, 5700],[-0.5, 4.5], [-1.2, 0.8], [-0.4, 0.6], [-0.4, 0.5]]
labels = np.zeros((nr_stars, 5), dtype=np.float32)
label_errors = np.zeros((nr_stars, 5), dtype=np.float32)
SNR_data = np.zeros(nr_stars, dtype=np.float32)

with fits.open('stars/allStar-l31c.2.fits') as label_file:
    label_data = label_file[1].data
    duplicates = 0
    for i in tqdm(range(nr_stars)):
        target_id = loc_ids[i] + '.' + obj_ids[i]
        matches =  np.argwhere(label_data['TARGET_ID'] == target_ids[i])
        if len(matches) > 1: # Some har multiple TARGET_IDs. Just pick the one with highest SNR.
            duplicates += 1
            idx = matches[np.argmax(label_data['SNR'][matches])]
        else:
            idx = matches[0]
        labels[i,0] = label_data['Teff'][idx]
        labels[i,1] = label_data['logg'][idx]
        labels[i,2] = label_data['FE_H'][idx]
        labels[i,3] = label_data['MG_FE'][idx]
        labels[i,4] = label_data['SI_FE'][idx]
        label_errors[i,0] = label_data['Teff_ERR'][idx]
        label_errors[i,1] = label_data['logg_ERR'][idx]
        label_errors[i,2] = label_data['FE_H_ERR'][idx]
        label_errors[i,3] = label_data['MG_FE_ERR'][idx]
        label_errors[i,4] = label_data['SI_FE_ERR'][idx]
        SNR_data[i] = label_data['SNR'][idx]

print(f"There were {duplicates} duplicates out of {nr_stars} stars.")
100%|██████████| 2723/2723 [00:55<00:00, 48.68it/s]
There were 292 duplicates out of 2723 stars.

3. Data Cleaning and Manipulation

3.1 Quality Cuts

Quite a few of the stars do not have ASPCAP derived labels, due to data quality issues. ASPCAP signals this by setting the label value to -9999. We have no use for these stars, and they will be removed. We will also remove stars with a signal to noise ratios of below 50, as well as very low metallicity stars (Fe/H < -1). A few stars have 0 flux all over the spectrum. These are also obviously useless and will be removed.

In [9]:
quality_cut_idx = (labels[:,0] != -9999.) & (labels[:,1] != -9999.) & (labels[:,2] != -9999.) \
                & (labels[:,3] != -9999.) & (labels[:,4] != -9999.) \
                & (SNR_data > 50) & (labels[:,2] > -1) & (spec != 0.).any(axis=1)


spec = spec[quality_cut_idx]
spec_error = spec_error[quality_cut_idx]
labels = labels[quality_cut_idx]
SNR_data = SNR_data[quality_cut_idx]
bitmask = bitmask[quality_cut_idx]
label_errors = label_errors[quality_cut_idx]

nr_stars = len(bitmask)
print(f"Cut reduced number of objects to {nr_stars}")
Cut reduced number of objects to 1692

3.1.1 A quick look at the label space

Below we have done a corner plot of the label space. The labels seem to have low covariance, with the exception of effective temperature and surface gravity, which have a very strong, seemingly linear trend. The atmospheric labels (Fe/H, Mg/Fe, and Si/Fe) also all seem to have some weaker correlation. Low covariance between the labels is a good thing for our model, as it means more linearly independent information can be constructed from them. If one label is simply a function of another(or several), it is not actually a new piece of information that can be used to predict stellar spectra.

In [10]:
fig = corner.corner(labels, labels=label_names, quantiles=[0.16, 0.84]);

3.2 Pseudo-Continuum Normalization

3.1.1 Basic idea

To be able to compare flux measurements for different stars, we need some sort of relative flux scale that holds for every star. We are simply interested in the relative absorption line strength - how large fraction of the star's flux is reduced by the absorption line at every wavelength. This requires us to know what the stars flux/wavelength relation looked like without absorption. One option is to derive a theoretical line, from first principles. This would require large assumptions on several important aspects of the star, combined with the fact that we would have to blindly trust that our labels are accurate (and sufficient to form such a model).

An alternative, more data-driven approach is suggested in Ness et al., section 2.3. The approach involves picking out pixels known to be less affected by absorption lines. These pixels should represent a fairly good continuum of the spectrum. This paper extracts these points by running The Cannon iteratively, first with a more naive pseudo-continuum, to pick out points weakly affected by absorption.

We will employ our own pseudo-continuum normalization, using the less affected pixels provided by The Cannon. They are loaded from the "pixels_cannon.npy" file. Using this, we will, as suggested in Ness et al., fit a second order polynomial to each of the three wavelength bands. Exhaustive testing of higher order polynomials revealed that second order across all three bands provides the best result. The polynomials will also be weighted by the squared inverse of their error. A "wavelength.npy" file, containing the wavelengths of the pixels (as they do not sync entirely up with ours) is also loaded. Before we fit the second order polynomial to the chosen pixels, we will also attempt to remove large outliers, by running through the pixels in bulks of 20, and removing anything that falls very far from the median of those 20 pixels.

The Cannon arrays contain only 7214 pixels, against the total 8575 pixels provided in the APOGEE datasets. This is the reduced set of pixels that we consider to actually lie on the bands, and we will employ only these in our models. Everything containing only these 7214 pixels will be referred to as "reduced" in the code.

In [0]:
spec_reduced = spec.copy()[:,band_cut]
spec_error_reduced = spec_error.copy()[:,band_cut]
In [0]:
wl_reduced = np.load("wavelength.npy")  # Wavelength array of reduced pixels.
nr_pixels_reduced = len(wl_reduced)

clean_cut_reduced = np.load("pixels_cannon.npy")  # Boolean cut array for "clean" pixels at reduced wavelengths.
nr_clean_pixels = np.sum(clean_cut_reduced)
wl_clean = wl_reduced[clean_cut_reduced]  # Wavelength aray for "clean" pixels.
nr_clean_pixels

clean_cut = np.zeros(nr_pixels, dtype=bool)  # Boolean cut array for "clean" pixels at full 8575 pixel spectrum.
for i in range(3):
    clean_cut[band_idxs[i,0]:band_idxs[i,1]] = clean_cut_reduced[band_idxs_reduced[i,0]:band_idxs_reduced[i,1]]

3.1.2 Setting up and testing the pseudo-continuum normalization

Below, the outlier and polyfit functions are defined, as explained below. We have chosen a 10% distance from the median tolerance limit on the outlier algorithm for each batch-size of 20. The result of this outlier removal is plotted below, where removed outliers are marked in red. As we can see, the algorithm is relatively effective. It could always be further improved, but the impact is probably not too substantial.

In [0]:
def find_outliers(data):
    # Iterates through 1D "data" object in bulks of 20 points, tagging datapoints lying outside 10% of median.
    # Returns boolean array with False tag on outliers.
    not_outliers = np.ones(len(data), dtype=bool)
    group_points = 20
    iterations = int(len(data)/group_points)
    for i in range(iterations-1):
        subdata = data.copy()[i*group_points:(i+1)*group_points]
        not_outliers[i*group_points:(i+1)*group_points] = np.abs(subdata - np.median(subdata))/np.median(subdata) < 0.1
    subdata = data.copy()[(iterations-1)*group_points:]
    not_outliers[(iterations-1)*group_points:] = np.abs(subdata - np.median(subdata))/np.median(subdata) < 0.1
    return not_outliers


poly_band_indexes = [[0, 3450],[3450, 6200], [6200, 8750]]  # Intervals to perform polyfit on.

def get_band_polyfit(wl, spec, spec_err, clean_cut=clean_cut, degrees=(2,2,2)):
    # Calculates the polynomial fit of chosen "clean_cut" pixels of spec(wl) in three bands,
    # with chosen degrees, considering errors.
    polyfit = np.zeros(nr_pixels)
    for b in range(3):  # One polyfit per band.
        degree = degrees[b]
        # Keeping only current band.
        band_wl = wl[poly_band_indexes[b][0] : poly_band_indexes[b][1]]
        band_spec = spec[poly_band_indexes[b][0] : poly_band_indexes[b][1]]
        band_err = spec_err[poly_band_indexes[b][0] : poly_band_indexes[b][1]]

        # Keeping only clean pixels.
        band_wl_clean = band_wl[clean_cut[poly_band_indexes[b][0] : poly_band_indexes[b][1]]]
        band_spec_clean = band_spec[clean_cut[poly_band_indexes[b][0] : poly_band_indexes[b][1]]]
        band_err_clean = band_err[clean_cut[poly_band_indexes[b][0] : poly_band_indexes[b][1]]]

        # Keeping only non-outliers.
        outlier_cut = find_outliers(band_spec_clean)
        band_wl_clean2 = band_wl_clean[outlier_cut]
        band_spec_clean2 = band_spec_clean[outlier_cut]
        band_err_clean2 = band_err_clean[outlier_cut]
        
        p = np.polyfit(band_wl_clean2, band_spec_clean2, deg=degree, w=1/band_err_clean2)
        poly = np.zeros(len(band_wl))
        for i in range(degree+1):
            poly += p[-i-1]*band_wl**i
        polyfit[poly_band_indexes[b][0] : poly_band_indexes[b][1]] = poly
    return polyfit
In [14]:
fig, ax = plt.subplots(3, 2, figsize=(16,8))
for i, j in enumerate(np.random.randint(0, nr_stars, 6)):
    polyfit = get_band_polyfit(wl, spec[j], spec_error[j], degrees=(2,2,2))
    non_outlier_cut = find_outliers(spec[j][clean_cut])
    ax[i//2, i%2].scatter(wl_clean, spec[j][clean_cut], s=4, c="red")
    ax[i//2, i%2].scatter(wl_clean[non_outlier_cut], spec[j][clean_cut][non_outlier_cut], s=4, c="navy")
    ax[i//2, i%2].plot(wl, polyfit, c="g", lw=2)

3.3.3 Normalize them all!

Below we have employed our pseudo-continuum normalization on all stars. The effect is plotted for a handful of randomly chosen stars. On the left-hand side, we see the original spectrum, overplotted with the fitted polynomial. In the middle, we see the pixels used in the fit, and on the right, we see the final resulting normalized spectrum. The normalized spectrum all in all look very good, while there are some features that can be worrisome. Especially towards the edged of the bands, the spectrum tends to diverge a bit. The normalization algorithm could definitively be improved for better edge-handling.

In [15]:
polynomial_fits = np.zeros((nr_stars, nr_pixels))
for j in tqdm(range(nr_stars)):
    polyfit = get_band_polyfit(wl, spec[j], spec_error[j], degrees=(2,2,2))
    if np.sum(np.isnan(polyfit)) > 0:
        print(np.sum(np.isnan(polyfit)))
    if np.sum(np.isinf(polyfit)) > 0:
        print(np.sum(np.isinf(polyfit)))

    polynomial_fits[j] = polyfit
    if j %387 == 0:
        fix, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(18,4))
        ymax, ymin = np.max(spec[j][clean_cut])*1.1, np.min(spec[j][clean_cut])*0.9
        ax1.scatter(wl, spec[j], s=0.5, color="navy")
        ax1.plot(wl, polyfit, c="g", lw=3)
        ax2.plot(wl, polyfit, c="g", lw=3)
        ax2.scatter(wl_clean[non_outlier_cut], spec[j][clean_cut][non_outlier_cut], s=2, c="navy")
        ax1.set_ylim(ymin, ymax); ax2.set_ylim(ymin, ymax);
        ax3.scatter(wl, spec[j]/polyfit, s=0.5, c="navy")
        ax3.axhline(y=1, c="g", lw=3)
100%|██████████| 1692/1692 [00:09<00:00, 179.68it/s]

Applying the normalization to all data:

In [0]:
spec_norm = spec.copy()/polynomial_fits
spec_error_norm = spec_error.copy()/polynomial_fits
spec_error_norm[spec_error_norm > 1e4] = 1e10  # Putting flagged high-error objects back to 1e10, for convenience.

3.3 Applying bitmasks on bad data

APOGEE provides a "possible error" bitmask for each pixel of each star. The meaning of each bit is explained here, but they all indicate that something might be wrong with the pixel. The idea is to set the standard error of all pixels we suspect might be affected by some error-inducing problem to a very large value, just like we did with the zero-flux pixels.

Below we have plotted the distribution of different bitmask flags. As we can see, only 7 flags are actually active in our data, and 4 of these account for most cases. The plot does not account for overlap (one pixel may have more than one flag), but we found that very few pixels have more than one flag. The 0-flag seems to correspond to pixels marked with 0 flux, which we have already marked, so it will not contribute anything.

Now, some flags are less serious than others, and there is an argument for treating them differently. Some experimenting does, however, show that simply filtering out pixels with any of the flags yields the best final predictive capabilities for our model. We seem to have enough data to go on that it is beneficial to be more critical of potential errors. If we had substantially less data, there might have been a better payoff for discarding fewer suspicious pixels.

We will simply assume that any pixel with non-zero mask value (which means no flag is set) is bad, and their standard errors will be set accordingly. As seen in the code-prints below, this only removes ~16.5% of our on-band pixels (i.e. not accounting for the pixels outside the bands, which we won't use).

In [17]:
bitmask_unpacked = np.zeros((nr_stars, nr_pixels, 15), dtype=bool)

for i in range(15):
    bitmask_unpacked[:, :, i] = ((bitmask & 2**i) != 0)
    
bitmask_count = np.sum(bitmask_unpacked, axis=(0,1))
plt.figure(figsize=(12,4))
plt.bar(range(15), bitmask_count, color="navy")
print(f"Out of a total of {bitmask[:,band_cut].size} pixels, {np.sum(bitmask[:,band_cut] != 0)} has at least one error bitflag, corresponding to {100*np.sum(bitmask[:,band_cut] != 0)/bitmask[:,band_cut].size:.2f} %")

spec_error_norm[bitmask != 0] = 1e10  # Discarding bad pixels.
Out of a total of 12206088 pixels, 2009042 has at least one error bitflag, corresponding to 16.46 %

3.3.1 A final look at data errors.

We have just increased the error of a lot of pixels, and need to do a short analysis of what kind of data we are left with. The plots below of some random spectra and their errors makes it evident that some stars were hit much harder by the bitmask than others, and might not be worth keeping at all. Below this again, we have plotted a histogram of the number of pixels tagged as unusable for each star (not including the pixels tagged as unusable because they are out-of-band, but only on-band pixels). It reveals that many stars have a substantial amount of pixels tagged. These stars are of no use to our model, and we have decided to discard all stars with more than 3000 unusable pixels (corresponding to more than 41.5%).

In [18]:
fig, ax = plt.subplots(3, 3, figsize=(18, 10))
plot_nrs = np.random.randint(0, nr_stars, 9)
for i in range(9):
    ax[i//3, i%3].set_title(f"{plot_nrs[i]}")
    ax[i//3, i%3].plot(wl, spec_norm[plot_nrs[i]], c="darkblue")
    ax2 = ax[i//3, i%3].twinx()
    ax2.scatter(wl, spec_error_norm[plot_nrs[i]], s=1, c="firebrick")
In [0]:
spec_norm_reduced = spec_norm[:,band_cut]
spec_error_norm_reduced = spec_error_norm[:,band_cut]
In [20]:
x = np.sum(spec_error_norm_reduced > 1e8, axis=1)
plt.hist(x, bins=100, color="navy")
spec_norm_reduced2 = spec_norm_reduced[np.sum(spec_error_norm_reduced > 1e8, axis=1)<3000]
spec_error_norm_reduced2 = spec_error_norm_reduced[np.sum(spec_error_norm_reduced > 1e8, axis=1)<3000]
labels2 = labels[np.sum(spec_error_norm_reduced > 1e8, axis=1)<3000]
nr_stars_final = spec_norm_reduced2.shape[0]
print(f"Removing {np.sum(x>3000)} high-error stars, reducing number of usable stars from {nr_stars} to {nr_stars_final}.")
Removing 176 high-error stars, reducing number of usable stars from 1692 to 1516.
In [0]:
good_pixels = spec_error_norm_reduced2 < 1e9  # Boolean error tagging pixels as usable or unusable (very high error).

4. The Spectrum Prediction Model

With all data-cleaning out of the way, we are finally ready to build our first predictive model.

4.1 Setting up the data

4.1.1 Dividing in two

Sklearn has a method for splitting a random selection of input data into two randomly selected groups. Below, we employ this to get a training set with 67% of the data, and a testing set with 33% of the data. This results in around a thousand stars for the training set, and five hundred for the testing set. Both should be respectable enough sizes to get a fair model.

In [22]:
from sklearn import preprocessing
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error

labels_train, labels_test, spectrum_train, spectrum_test, error_train, error_test = train_test_split(labels2, spec_norm_reduced2, spec_error_norm_reduced2, test_size=0.33, random_state=42, shuffle=True)
good_pixels_train = error_train < 1e9
good_pixels_test = error_test < 1e9
nr_training_stars, nr_testing_stars = labels_train.shape[0], labels_test.shape[0]
print(f"Divided data into training set of {nr_training_stars} objects, and testing set of {nr_testing_stars} objects.")
Divided data into training set of 1015 objects, and testing set of 501 objects.

4.1.2 Rescaling Labels

Looking at the raw numbers of our labels in their respective units, it is clear they are of different magnitudes. Especially the effective temperature sticks out, being over a factor 1000 larger than the other values. We will simply scale all our labels to mean 0 and variance 1, using sklearn's "StandardTransformer" module, which performs the scaling:

$$ \theta' = \frac{\theta - \bar{\theta}}{\sigma_\theta} $$

Such scaling is useful both for numerical precision in calculations, the convergence rate of learning models, and a more meaningful comparison of values between labels. Below we have plotted the distributions of labels in the training set, before(above) and after(below) scaling.

In [0]:
# pt = preprocessing.PowerTransformer()
Scaler = preprocessing.StandardScaler()
labels_train_scaled = Scaler.fit_transform(labels_train)
labels_test_scaled = Scaler.transform(labels_test)
In [24]:
fig, ax = plt.subplots(2, 5, figsize=(18,8))
for i in range(5):
    ax[0,i].set_title(label_names[i]); ax[0,i].set_ylim(0, 220)
    ax[0,i].hist(labels_train[:,i], bins=40, color="navy")
    ax[1,i].hist(labels_train_scaled[:,i], bins=40, color="navy")
    ax[1,i].set_xlim(-4, 4); ax[1,i].set_ylim(0, 220)

4.2 Building the Model

We will now build a model for generating spectrum from a set of labels. Each star consist of a set of five labels $l_n = [Teff_n,...,SiFe_n]$. Consider each of the $N$ star to have a label-vector $\boldsymbol\ell_{n}$ with $K$ different labels. In our case, the label-vector will consist of our 5 labels, in addition to any function of them we would wish to create:

$\boldsymbol\ell_{n} = \left[Teff_n,\ logg_n,\ ...,\ f(Teff_n, logg_n...),\ ...\right]$.

For each wavelength $\lambda$, we will assume that the flux can be written as a linear combination of these labels, with a set of coefficients $\boldsymbol \theta_\lambda$ yet to be decided. For some star $n$, we would have

$$ f_{n\lambda} = \boldsymbol \ell_n \cdot \boldsymbol \theta_\lambda $$

where $\boldsymbol \theta_\lambda$, for each wavelength, is a vector of coefficients which have to be solved for.

Solving for all $N$ stars at once leaves this as a matrix equation to be solved: $$ \boldsymbol f_{\lambda} = \hat{\boldsymbol \ell} \cdot \boldsymbol \theta_\lambda $$

Ness et al., constructs a likelihood for this model, employing an optimizer to solve for the coefficients. Optimizers are computationally rather expensive, and we will simply solve this as a linear algebra exercise.

Now, in principle, all that remains is to solve the equation above for the coefficients. We would, however, like to also make use of our standard error estimates, to weight the different pixels. This will increase the accuracy and predictive power of our coefficients. For weights, we will simply use the square inverted standard error. We introduce the diagonal weight matrix

$$ W = \begin{pmatrix} 1/\sigma_0^2 &0& 0 & \cdots \\ 0 & 1/\sigma_1^2 & 0 & \cdots \\ \ddots & \ddots & \ddots & \ddots \end{pmatrix} $$

This expands the linear algebra problem we have to solve to

$$ \hat{\boldsymbol \ell}^T \boldsymbol W\, \boldsymbol f_{\lambda} =\hat{\boldsymbol \ell}^T \boldsymbol W\, \hat{\boldsymbol \ell}\, \boldsymbol \theta_\lambda $$ where everything but $\boldsymbol \theta_\lambda$ is still known, so it's still just a matrix equation.

Now for our choice label vectors, $\boldsymbol \ell_n$. We will, as suggested in Ness et al., use all quadratic combinations of our five stellar labels (the paper has three, but the principle is the same). Including a constant term, the labels vectors will consist of 21 elements:

$$ \boldsymbol \ell_n = [1,\ Teff_n,\ logg_n,\ FeH_n,\ MgFe_n,\ SiFe_n,\ Teff_n^2,\ ...\ ,\ SiFe_n^2,\ Teff_n\cdot logg_n,\ ...\ ,\ MgFe_n\cdot SiFe_n ] $$ for each star, $n$.

Combining them, we get a label matrix. This, as well as the least-square solving of all coefficients $\boldsymbol \theta_\lambda$ (also packed into a coefficient matrix), is done below.

In [0]:
label_means = np.mean(labels_train, axis=0)

label_matrix = np.zeros((nr_training_stars, 21))
label_matrix[:, 0] = 1
label_matrix[:, 1:6] = labels_train_scaled
label_matrix[:, 6:11] = labels_train_scaled**2
label_matrix[:, 11:15] = labels_train_scaled[:,0,None]*labels_train_scaled[:, 1:]
label_matrix[:, 15:18] = labels_train_scaled[:,1,None]*labels_train_scaled[:, 2:]
label_matrix[:, 18:20] = labels_train_scaled[:,2,None]*labels_train_scaled[:, 3:]
label_matrix[:, 20] = labels_train_scaled[:,3]*labels_train_scaled[:, 4]
In [26]:
coefficient_matrix = np.zeros((nr_pixels_reduced, 21)) # The 21 coefficients of each pixel.

for i in tqdm(range(nr_pixels_reduced)):
    weights_matrix = np.diag( 1/error_train[:,i]**2 )
    pixel_vector = spectrum_train[:,i]
    coeff, res, rank, s = np.linalg.lstsq(label_matrix.T@weights_matrix@label_matrix, np.dot(label_matrix.T@weights_matrix, pixel_vector), rcond=None)
    coefficient_matrix[i] = coeff
100%|██████████| 7214/7214 [01:12<00:00, 93.76it/s]

Armed with these coefficients, and our labels, we can now write an explicit function for deriving a star-spectrum from its labels. This function is shown below.

In [0]:
def spectrum_fit(labels, coefficient_matrix=coefficient_matrix):
    # Returns the spectrum of a star with given labels, assuming coefficients are calculated.
    Teff, logg, FeH, MgFe, SiFe = labels
    const, a, b, c, d, e, aa, bb, cc, dd, ee, ab, ac, ad, ae, bc, bd, be, cd, ce, de = coefficient_matrix.T
    model = a*Teff       + b*logg       + c*FeH        + d*MgFe     + e*SiFe \
            + aa*Teff**2   + bb*logg**2   + cc*FeH**2    + dd*MgFe**2 + ee*SiFe**2 \
            + ab*Teff*logg + ac*Teff*FeH  + ad*Teff*MgFe + ae*Teff*SiFe \
            + bc*logg*FeH  + bd*logg*MgFe + be*logg*SiFe \
            + cd*FeH*MgFe  + ce*FeH*SiFe \
            + de*MgFe*SiFe \
            + const
    return model

4.2.1 Confirming convergence on training data

To convince ourselves that the model worked at all, we will reproduce some spectra in our training set with their respective labels. Below we see a plot of some real and predicted spectra for 4 randomly chosen stars at different wavelength intervals. We see that they agree comfortably well. The exceptions are where the pixels are marked with very high error, but we have basically told the model to not care about these at all. The RMS error is also acceptably small. Note that we never include contributions of "bad" pixels into RMSE calculations.

In [28]:
x_intervals = [[300, 800], [1300, 1800], [3600, 4100], [4100, 4600]]
fig, ax = plt.subplots(2, 2, figsize=(18,8))
for i in range(4):
    j = np.random.randint(0, 450)
    specfit = spectrum_fit(labels_train_scaled[j], coefficient_matrix=coefficient_matrix)
    ax[i//2, i%2].plot(specfit, c="navy", label="Model Spectrum")
    ax[i//2, i%2].set_ylabel("Flux"); ax[i//2, i%2].set_xlabel("Wavelength")
    ax[i//2, i%2].plot(spectrum_train[j], ls="--", c="g", label = "APOGEE Spectrum")
    ax2 = ax[i//2, i%2].twinx(); ax2.set_ylabel("Flux Error")
    ax2.plot(error_train[j]/1e10, c="crimson", label="APOGEE Flux Error")
    ax[i//2, i%2].set_title(f"Star {target_ids[j]} with RMSE = {np.sqrt(mean_squared_error(spectrum_train[j], specfit, sample_weight=good_pixels_train[j])):.4f}")
    ax[i//2, i%2].set_xlim(4000, 4500)
    ax[i//2, i%2].set_ylim(0.5, 1.5)
fig.tight_layout(); ax2.legend(); ax[-1,-1].legend();

5. Reversing the spectrum predicting model

5.1 Setting up a Maximum Likelihood Estimation of the Model

Armed with our spectrum-from-labels predicting model, we will reverse this into a labels-from-spectrum predicting model, which is the model we actually want. The idea is that, for each star, there should be a set of labels that bring the generated spectrum as "close" as possible to its actual spectrum.

A question arises on the definition of "closeness". The most obvious candidate for measuring closeness between two sets, and the one we will use, is in a chi-squared sense (A more linear distance is also a good candidate, and could be explored). From this, we can define a likelihood, as a function of labels, which needs to be maximized. This likelihood is rewritten to a log-likelihood, and inversed, such that we wish to minimize it. The (non-negative) chi-squared log-likelihood will then be

$$ \ln p\left(f_{n \lambda} | \boldsymbol{\theta}_{\lambda}, \boldsymbol \ell_{n}\right)=-\frac{1}{2} \frac{\left[f_{n \lambda}-\boldsymbol{\theta}_{\lambda}^{T} \cdot \boldsymbol\ell_{n}\right]^{2}}{\sigma_{n \lambda}^{2}}-\frac{1}{2} \ln \left(\sigma_{n \lambda}^{2}\right) $$

We have below defined the negative of such a log-likelihood, which includes the assumed errors in the pixels. As before, some pixels are tagged with extremely high errors, and will not contribute to the predictions.

In [0]:
def lnlike(labels, spec_actual, spec_error):
    y = spec_actual
    y_err = spec_error
    model = spectrum_fit(labels)
    inv_sigma2 = 1.0/(y_err**2)
    neg_ll = 0.5*(np.sum((y-model)**2*inv_sigma2 - np.log(inv_sigma2)))
    return neg_ll

5.2 Optimizing the Likelihood

All that remains is to pick an optimizer to minimize the negative log-likelihood. Scipy has a bunch of good options, and we went with the standard scipy.optimize.minimize, which defaults to the BFGS method. We initialize the likelihood with initial values of 0 across all labels, which is the mean values, after our scaling.

To our surprise, we found that many of the other methods were very unstable or refused to converge. Given that the likelihood surface is simply a sum of quadratic functions, we expected it to be rather well-behaved. Even though the BFGS method gives good results, it throws out a few warnings of "divide by zero", and "may not have converged due to precision loss". Further exploring method and convergence of the optimizer is definitively needed. It does, however, produce reliable results, so we leave it be for now.

In [30]:
labels_estimated_training_scaled = np.zeros((len(labels_train_scaled), 5))

for i in tqdm(range(len(labels_train_scaled))):
    spec_actual = spectrum_train[i]
    y_err = error_train[i]
    initial = (0,0,0,0,0)
    bounds = ((-5,5),(-5,5),(-5,5),(-5,5),(-5,5))
    res_1 = scipy.optimize.minimize(lnlike, initial, args=(spec_actual, y_err))
    labels_estimated_training_scaled[i] = res_1.x
np.save("labels_estimated_training_scaled.npy", labels_estimated_training_scaled)
100%|██████████| 1015/1015 [06:29<00:00,  3.84it/s]
In [0]:
labels_estimated_training_scaled = np.load("labels_estimated_training_scaled.npy")
In [32]:
labels_estimated_testing_scaled = np.zeros((len(labels_test_scaled), 5))
for i in tqdm(range(len(labels_test_scaled))):
    spec_actual = spectrum_test[i]
    y_err = error_test[i]
    initial = (0,0,0,0,0)
    bounds = ((-5,5),(-5,5),(-5,5),(-5,5),(-5,5))
    res_1 = scipy.optimize.minimize(lnlike, initial, args=(spec_actual, y_err))
    labels_estimated_testing_scaled[i] = res_1.x
np.save("labels_estimated_testing_scaled.npy", labels_estimated_testing_scaled)
100%|██████████| 501/501 [03:12<00:00,  3.10it/s]
In [0]:
labels_estimated_testing_scaled = np.load("labels_estimated_testing_scaled.npy")

5.2.1 Analysing the Results

Below, the results of the predicted labels are plotted against ASPCAP labels for the training set (left) and testing set (right). Bias and RMSE is also shown. The bias is simply mean label difference (predicted - ASCPCAP). Scaled RMSE is simply the RMSE of the scaled (0 mean, unit variance) labels, which should be somewhat comparable metric between different labels.

The results seem fairly good. A couple of things are immediately obvious. Teff, logg, and Fe/H, have much stronger correlation trends than Mg/Fe and Si/Fe. This indicates that the former 3 have a much stronger relationship with the shape of the spectrum than the latter 2. This can also be observed from the scaled RMSE. The bias is also relatively small, indicating no larger general off-shifts, but will be studied in more detail below.

A somewhat surprising observation is that the testing data is as good a fit as the training data. This is in some regard pleasing, at it means we haven't produced an overfitted model, which only mimics our data. It might, however, indicate that we could extract more information from our data, and build a more tightly fit model. Our approximations that the spectrum pixels are quadratic functions of the labels could be generalized to include higher order terms. This might further improve the predictive capabilities of our model.

In [0]:
labels_estimated_training = Scaler.inverse_transform(labels_estimated_training_scaled)
labels_estimated_testing = Scaler.inverse_transform(labels_estimated_testing_scaled)
In [35]:
fig, ax = plt.subplots(5, 2, figsize = (16,20))
for i in range(5):
    RMS = np.sqrt(mean_squared_error(labels_estimated_training[:,i], labels_train[:,i]))
    RMS2 = np.sqrt(mean_squared_error(labels_estimated_training_scaled[:,i], labels_train_scaled[:,i]))
    MAE = mean_absolute_error(labels_estimated_training[:,i], labels_train[:,i])
    bias = np.sum(labels_estimated_training[:,i] - labels_train[:,i])/nr_training_stars
    ax[i, 0].scatter(labels_train[:,i], labels_estimated_training[:,i], s=4, c="navy")
    ax[i, 0].plot([-100,6000],[-100,6000], c="y")
    ax[i, 0].set_xlim(*label_xlims[i]); ax[i, 0].set_ylim(*label_xlims[i])
    ax[i, 0].set_title(f"{label_names[i]} - Training Data\nRMSE = {RMS:.4f}  |  RMSE(scaled) = {RMS2:.4f}\nMAE={MAE:.4f}    |    BIAS = {bias:.4f}")
    ax[i, 0].set_xlabel("ASPCAP Input Labels"); ax[i, 0].set_ylabel("Model Predicted Labels")
    RMS = np.sqrt(mean_squared_error(labels_estimated_testing[:,i], labels_test[:,i]))
    RMS2 = np.sqrt(mean_squared_error(labels_estimated_testing_scaled[:,i], labels_test_scaled[:,i]))
    MAE = mean_absolute_error(labels_estimated_testing[:,i], labels_test[:,i])
    bias = np.sum(labels_estimated_testing[:,i] - labels_test[:,i])/nr_training_stars
    ax[i, 1].scatter(labels_test[:,i], labels_estimated_testing[:,i], s=4, c="navy")
    ax[i, 1].plot([-100,6000],[-100,6000], c="y")
    ax[i, 1].set_xlim(*label_xlims[i]); ax[i, 1].set_ylim(*label_xlims[i])
    ax[i, 1].set_title(f"{label_names[i]} - Testing Data\nRMSE = {RMS:.4f}  |  RMSE(scaled) = {RMS2:.4f}\nMAE={MAE:.4f}    |    BIAS = {bias:.4f}")
    ax[i, 1].set_xlabel("ASPCAP Input Labels"); ax[i, 1].set_ylabel("Model Predicted Labels")
fig.tight_layout()

Below we have plotted the difference between predicted and ASPCAP labels, with ASPCAP values along the x-axis, to reveal trends in the data. The most obvious bias is that the model substantially under-predicts the effective temperature for high-temperature stars. This trend is also very visible in the take-one-out validation performed by the Cannon in Ness et al. 2018. The surface gravity seems to have a similar (but weaker) trend. This is unsurprising, as we concluded Log(g) and Teff to have a correlation in our initial label analysis. Fe/H also seems to be over-biased by the model in very low metalicities, although the data-points are few. The remaining labels show no obvious biases.

In [36]:
fig, ax = plt.subplots(3, 2, figsize=(16, 12))
for i in range(5):
    ax[i//2, i%2].scatter(labels_train[:,i], labels_estimated_training[:,i] - labels_train[:,i], s=4, c="navy")
    ax[i//2, i%2].axhline(y=0, c="y")
    ax[i//2, i%2].set_title(f"{label_names[i]} - Testing Data")
    ax[i//2, i%2].set_xlabel(f"ASPCAP Input Labels")
    ax[i//2, i%2].set_ylabel(f"Predicted - ASPCAP")
ax[-1, -1].axis('off')
plt.tight_layout();

5.3 A Kiel Diagram of the Results

Below we have plotted a Kiel diagram of the ~500 stars in the testing set. The data form a line, with decreasing Fe/H in up-left direction, in other words with increasing Teff and decreasing Log(g). There are some notable outliers to this trend, which indicates some suboptimal datapoints. On top, we have plotted two SDSS 6 Gyr isochrones generated by MIST, at Fe/H 0 and -1. Looking at the gradient of metallicity in our data, the two isochrones corresponds to the Fe/H trend we see. The Fe/H = -1 isochrone represent the outmost extreme of metallicity in our data, where we find few data points. The Fe/H = 0 seems to fall well onto the band of stars with the same metalicity. Our Kiel diagram does, in other words, look quite promising.

The absence of dwarf-stars can also be noted.

Below, we have also plotted the same plot, using the same stars, but this time with labels from the ASPCAP pipeline. We see no major different trends, and the occasional outlier is present here as well.

A comparable, more complete plot, derived from ASCPCAP labels, can be found in Holtzman et al. (2015), figure 12.

In [0]:
def get_isochrone(filename):
    """Returns an isochrone (bp_rp & mg) from a MIST generated .cmd isochrone file."""
    Teff = []
    logg = []
    with open(filename, "r") as infile:
        for i in range(13):  # Skipping info lines.
            infile.readline()
        for line in infile:
            words = line.split()
            Teff.append(float(words[4]))
            logg.append(float(words[5]))
    Teff = 10**np.array(Teff)
    logg = np.array(logg)
    return Teff, logg
In [0]:
Teff, logg = get_isochrone("MIST_iso_5c91f8dae5167.iso.cmd")
Teff2, logg2 = get_isochrone("MIST_iso_5c9460fc80da6.iso.cmd")
In [39]:
plt.figure(figsize=(14,6))
plt.scatter(labels_estimated_testing[:,0], labels_estimated_testing[:,1], c=labels_estimated_testing[:,2], s=12, cmap="gnuplot", vmin=-0.8, vmax=0.5)
plt.plot(Teff, logg, c="y", ls="--", label="6Gyr Isochrone, Fe/H = 0")
plt.plot(Teff2, logg2, c="k", ls="--", label="6Gyr Isochrone, Fe/H = -1")
cbar = plt.colorbar(); cbar.set_label("Fe/H")
plt.ylabel("Log(g)"); plt.xlabel("Teff"); plt.title("Kiel Diagram from Predicted Labels");
plt.ylim(6, -0.5); plt.xlim(6000, 3000);

plt.figure(figsize=(14,6))
plt.scatter(labels_test[:,0], labels_test[:,1], c=labels_test[:,2], s=12, cmap="gnuplot", vmin=-0.8, vmax=0.5)
plt.plot(Teff, logg, c="y", ls="--", label="6Gyr Isochrone, Fe/H = 0")
plt.plot(Teff2, logg2, c="k", ls="--", label="6Gyr Isochrone, Fe/H = -1")
cbar = plt.colorbar(); cbar.set_label("Fe/H")
plt.ylabel("Log(g)"); plt.xlabel("Teff"); plt.title("Kiel Diagram from ASPCAP Labels");
plt.ylim(6, -0.5); plt.xlim(6000, 3000);

5.4 Fitting an Unknown Spectrum with MCMC

5.4.1 Setup

We will now attempt to employ our model on an unknown spectrum. Instead of an optimizer, we will use MCMC. This will admittedly be must slower, but offer the advantage of much more information about the probability distributions of each parameter. The spectrum, with its provided error, is plotted below.

We define a chi-squared log-likelihood, with considerations of the errors we have been provided.

$$ \ln p\left(f_{n \lambda} | \boldsymbol{\theta}_{\lambda}, \boldsymbol \ell_{n}\right)=-\frac{1}{2} \frac{\left[f_{n \lambda}-\boldsymbol{\theta}_{\lambda}^{T} \cdot \boldsymbol\ell_{n}\right]^{2}}{\sigma_{n \lambda}^{2}}-\frac{1}{2} \ln \left(\sigma_{n \lambda}^{2}\right) $$

Since the labels are scaled, we will simply set an uninformed prior of $\pm 5$ on all labels, corresponding to 5 standard deviations in each direction.

$$ \ln{P(\boldsymbol l_n)} = \left\{\begin{array}{lr} 0, & \text{for } -5\leq \boldsymbol l_n \leq 5 \\ -\infty, & \text{otherwise } \end{array}\right\} $$

We employ 6 walkers, initiated at small deviations from zero, which is the mean of all labels. They are run for 15000 iterations.

In [40]:
with fits.open("mystery_spec_wiped.fits") as infile:
    mystery_spec = (infile[1].data)
    mystery_spec_error = (infile[2].data)
    mystery_bitmask = (infile[3].data)

polyfit = get_band_polyfit(wl, mystery_spec, mystery_spec_error, degrees=(2,2,2))

mystery_spec /= polyfit
mystery_spec_error /= polyfit

fig, ax1 = plt.subplots(1, figsize=(12,4))
ax1.plot(mystery_spec, label="Flux", c="navy")
ax2 = ax1.twinx()
ax2.plot(mystery_spec_error, ls="--", c="crimson", label="Flux Error")
ax2.set_ylim(0, 0.02); ax1.set_xlabel("Wavelength"); ax1.set_ylabel("Flux"); ax2.set_ylabel("Flux Error")
ax1.legend(); ax2.legend(); ax1.set_title("Mystery Spectrum")

mystery_spec = mystery_spec[band_cut]
mystery_spec_error = mystery_spec_error[band_cut]
In [0]:
def lnlike(labels, spec_actual, spec_error):
    y = spec_actual
    y_err = spec_error
    model = spectrum_fit(labels)
    inv_sigma2 = 1.0/(y_err**2)
    ll = -0.5*(np.sum((y-model)**2*inv_sigma2 - np.log(inv_sigma2)))
    return ll


def lnprior(labels):
    Teff, logg, FeH, MgFe, SiFe = labels
    if -5.0 <= Teff <= 5.0 and -5.0 <= logg <= 5.0 and -5.0 < FeH < 5.0 and -5.0 < MgFe < 5.0 and -5.0 < SiFe < 5.0:
        return 0.0
    return -np.inf

def lnprob(labels, y, yerr):
    lp = lnprior(labels)
    if not np.isfinite(lp):
        return -np.inf
    return lp + lnlike(labels, y, yerr)
In [44]:
seed = 1995
np.random.seed(seed)
ndim, nwalkers = 5, 16
pos = [(0,0,0,0,0) + 1e-4*np.random.randn(ndim) for i in range(nwalkers)]

import emcee
print(f"emcee version = {emcee.__version__}")
x = labels_train
y_data = mystery_spec
y_err = mystery_spec_error
sampler = emcee.EnsembleSampler(nwalkers, ndim, lnprob, args=(y_data, y_err))
emcee version = 2.2.1
In [0]:
niterations = 15000
sampler.reset()
sampler.run_mcmc(pos, niterations);

5.4.2 Convergence

Below, the label-values of each walker is plotted for each iteration. It is clear that the walkers reached some steady values relatively early on. Below, we also have the log-probability plotted over each step. We are confident that our results have converged. To leave the results unaffected by initial conditions, we implement a burn-in, cutting off all sampler values before 8000 iterations were reached.

In [46]:
fig, ax = plt.subplots(1, 5, figsize=(16,5))

for i,j in enumerate(sampler.chain[0,0,:]):
    ax[i].plot(sampler.chain[:,:,i].T, alpha=0.8, lw=0.5)
fig, ax = plt.subplots(1, 2, figsize=(16,4))
ax[0].set_ylabel('lnP'); ax[0].set_xlabel('step'); ax[0].set_title('lnP vs. step number');
ax[1].set_ylabel('lnP'); ax[1].set_xlabel('step'); ax[1].set_title('lnP vs. step number');
ax[0].plot(sampler.lnprobability.T);
ax[1].plot(sampler.lnprobability.T);
ax[1].set_ylim(-20500, -20100)
ax[1].axvline(x=8000, ls="--", c="y");
In [0]:
burn_in = 8000
samples = sampler.chain[:, burn_in:, :].reshape((-1, ndim))

5.4.3 Results

In the corner plot, we observe some of the same correlations as in our original analysis of all labels. There are also some notable differences, especially the strong relationship between Fe/H and Logg(g).

Looking at our printouts of derived values, with errors, below the graphs, we see that all labels fall in reasonable (but completely unconfirmable) ranges. We have also included the average error in each label reported by the ASPCAP pipeline. A difference is very apparent, where the ASPCAP values are orders of magnitude larger. Our errors are the standard deviation found by the MCMC walkers when traveling around our log-likelihood space. This is not at all representative for the actual inaccuracy of our prediction, because it assumes our log-likelihood is absolutely correct, which depends on our model and coefficient being without error altogether. Error-sources outside the immediate uncertainties in the log-likelihood is not included in our estimate, and the ASPCAP values probably represent a much more realistic estimate.

In [48]:
fig = corner.corner(samples, quantiles=[0.16, 0.84], labels=label_names);
In [49]:
avg_label_errors = np.mean(label_errors, axis=0)

mystery_labels_scaled = np.percentile(samples, 50, axis=0)
mystery_labels = Scaler.inverse_transform(mystery_labels_scaled)
labels_perc = Scaler.inverse_transform(np.percentile(samples, [16, 50, 84], axis=0))
v = labels_perc[0], labels_perc[1]-labels_perc[0], labels_perc[2] - labels_perc[1]
print(f"          Predicted Values     +std       -std     ASCPCAP std")
for i in range(5):
    print(f"{label_names[i]:6s} = {v[0][i]:14.4f}    |  {v[1][i]:6.4f}  |  {v[2][i]:6.4f}  |  {avg_label_errors[i]:8.4f}")
          Predicted Values     +std       -std     ASCPCAP std
Teff   =      4489.7657    |  0.8511  |  0.8941  |   74.4333
Log(g) =         1.9600    |  0.0022  |  0.0023  |    0.0531
Fe/H   =        -0.5213    |  0.0009  |  0.0009  |    0.0084
Mg/Fe  =         0.2120    |  0.0011  |  0.0011  |    0.0200
Si/Fe  =         0.2051    |  0.0010  |  0.0009  |    0.0253

5.5 Metalicity Analysis

A large contributing factor to the spectral absorption lines is the metallicity, Fe/H. Below, we have used the spectral model we have built to animate how the spectrum changes with metallicity. We create a spectrum with mean label values for everything but Fe/H. We then vary Fe/H from -1 to 0.5, and animate how the spectrum responds, for wavelength between 16000 and 16200 Angstrom.

As we can see, the absorption lines visible at -1 Fe/H becomes more and prominent towards higher values. I assume this indicates a more diverse atmosphere, with more non-H/He elements, and therefore more possibilities for wavelengths of absorption.

In [50]:
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation

FeH_list = np.linspace(-1, 0.5, 200)
labels = label_means
print("Spectrum Values:")
for i in [0,1,3,4]:
    print(f"{label_names[i]:8s} = {label_means[i]:10.4f}")

fig = plt.figure();
ax = plt.axes(xlim=(16000, 16200), ylim=(0, 1.5))
ax.set_ylabel("Normalized Flux")
ax.set_xlabel("Wavelength [Angstrom]")
line, = ax.plot([], [], lw=1, c="navy");

def init():
    line.set_data([], [])
    return line,

def animate(i):
    x = wl_reduced
    labels[2] = FeH_list[int(i)]
    labels_scaled = Scaler.transform(labels.reshape(1, -1))[0]
    y = spectrum_fit(labels_scaled)
    line.set_data(x, y)
    ax.set_title(f"Fe/H = {FeH_list[i]:.2f}")
    return line,

anim = animation.FuncAnimation(fig, animate, init_func=init,
                               frames=200, interval=50, blit=True);

anim.save('asdf.gif', writer='imagemagick', dpi=120);
plt.close(fig);
# anim.save('asdf.mp4', dpi=200);
MovieWriter imagemagick unavailable. Trying to use pillow instead.
Spectrum Values:
Teff     =  4597.6016
Log(g)   =     2.2849
Mg/Fe    =     0.0743
Si/Fe    =     0.0334
In [51]:
from IPython.display import HTML
HTML('<img src="asdf.gif">')
Out[51]:

5.6 Binary Star Considerations

Many observed stars might, in reality, be binary stars, which APOGEE has been incapable of separating because they are either far from earth or close to each other. Undetected binary stars will cause the observed spectrum to be a superposition of the two individual spectra, as flux-contributions from both stars will be measured. If such a binary system is fitted using a single-star spectral model, the derived labels will bias towards inaccurate values. El-Badry et al. 2018 explores this in detail and finds that such a wrongful fit will negatively bias the effective temperature and metallicity. This bias is diminished if the mass of the two stars are very different, as the smaller star will then contribute negligibly to the total spectrum.

The detection of such binary systems has also been studied in the literature. Often, when visual separation fails, one tries to detect binary systems by looking for doppler shifts in the spectrum, as the binary stars shift in velocities as they orbit each other. This is dependent on the period being reasonably small, which is often not the case. El-Badry et al. 2017 explores a data-driven approach on binary star detecting, using the Cannon. They find that, if the stars differ moderately ($ 0.4 < m_1/m_2 < 0.8$) in mass, one should find that the spectrum will fit a superposition of two single-star spectra better than a single-star spectrum model. Such an approach could be implemented in our model.

6. An Alternate Approach - Neural Network in Keras

6.1 Motivation and Introduction

Finally, we will present an entirely different data-driven model - a neural network. A neural network holds a series of both advantages and disadvantages so our model so far.

One advantage is that, while expensive to train, a neural network is very cheap to apply to new data. In contrast, in our earlier optimize-driven model, the most expensive part wasn't actually the training, but the optimizer fitting to new data. This can actually be a huge problem - training of a model is usually done with a rather limited set of data, while one often wishes to apply this model to data orders of magnitude larger in size. Another, at least superficial advantage, is the initial simplicity of a neural network. Simply putting the spectra into one end and the labels in the other yields acceptable results without much thinking.

As a counter, neural networks can be huge black boxes and hard to master. The architectural and hyperparameter configurations are endless, and often with little intuitive guidance. A neural network can also be incapable of effectively taking advantage of all information available, as with the flux errors in our case, as we soon will see.

6.2 Setup

We will be constructing our network with Google's Tensorflow, with Keras as a frontend. We will focus on standard sequential dense layers, with no huge surprises. We will later expand our dataset with some data augmentation.

The data is divided into training and testing sets of 67% and 33%, respectively, as before, and linearly scaled to zero mean and unit variance, as before.

A simple network of 5 hidden layers, of sizes 1000, 2000, 2000, 500, 200, and 20 is set up below. All with linear activation, which helped with certain convergence-issues. It was found that this layer size provided the best results within reasonable training time, and changing the layer sizes and numbers (within reason) did not noticeably change the final results. This presented a total of 13'623'325 trainable parameters. A batch size of 32 was chosen, together with 250 epochs, beyond which the results changed little. The Adam optimizer was used, with the amsgrad option enabled.

In [52]:
import tensorflow as tf
from tensorflow import keras
from keras import optimizers
from keras.models import Sequential
from keras.layers import Dense, Activation, LocallyConnected1D, Conv1D, Reshape
from keras.callbacks import ModelCheckpoint, History
from keras.wrappers.scikit_learn import KerasRegressor
from sklearn import preprocessing
from sklearn.model_selection import cross_val_score, KFold, train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error

print(tf.__version__)
1.13.1
Using TensorFlow backend.
In [53]:
input_train, input_test, err_train, err_test, output_train, output_test = train_test_split(spec_norm_reduced2, spec_error_norm_reduced2, labels2, test_size=0.33, random_state=42, shuffle=True)

nr_training_stars, nr_testing_stars = input_train.shape[0], input_test.shape[0]
print(f"Divided data into training set of {nr_training_stars} objects, and testing set of {nr_testing_stars} objects.")

ScalerNN = preprocessing.StandardScaler()
ScalerNN.fit(output_train)
output_train_scaled = ScalerNN.transform(output_train)
output_test_scaled = ScalerNN.transform(output_test)
Divided data into training set of 1015 objects, and testing set of 501 objects.
In [54]:
fig, ax = plt.subplots(2, 5, figsize=(18,8))
for i in range(5):
    ax[0,i].set_title(label_names[i])
    ax[0,i].hist(output_train[:,i], bins=40)
    ax[1,i].hist(output_train_scaled[:,i], bins=40)
In [55]:
opti = optimizers.adam(amsgrad=True)#, decay=0.01)
def nn_clf():
    model = Sequential()
    model.add(Dense(1000, input_shape=(7214,), activation="linear", kernel_initializer='random_uniform'))
    model.add(Dense(2000,  activation="linear", kernel_initializer='random_uniform'))
    model.add(Dense(2000,  activation="linear", kernel_initializer='random_uniform'))
    model.add(Dense(200,  activation="linear", kernel_initializer='random_uniform'))
    model.add(Dense(20, activation="linear", kernel_initializer='random_uniform'))
    model.add(Dense(5, activation="linear", kernel_initializer='random_uniform'))
    model.compile(optimizer=opti, loss='mean_squared_error',  metrics=['mae',"mse"])
    return model
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
In [56]:
nn_clf().summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 1000)              7215000   
_________________________________________________________________
dense_2 (Dense)              (None, 2000)              2002000   
_________________________________________________________________
dense_3 (Dense)              (None, 2000)              4002000   
_________________________________________________________________
dense_4 (Dense)              (None, 200)               400200    
_________________________________________________________________
dense_5 (Dense)              (None, 20)                4020      
_________________________________________________________________
dense_6 (Dense)              (None, 5)                 105       
=================================================================
Total params: 13,623,325
Trainable params: 13,623,325
Non-trainable params: 0
_________________________________________________________________

6.3 Run and Results

As before, the predicted results are plotted against again ASPCAP labels below, with bias, RMSE, and MAE. As we can see, the networks , by all means, successful, making predictions of comparable quality to our former predictive model. It does, however, lag somewhat behind with a ~30% higher error.

In [57]:
clf = KerasRegressor(build_fn=nn_clf, batch_size=32, epochs=250)
clf.fit(input_train, output_train_scaled)
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.cast instead.
Epoch 1/250
1015/1015 [==============================] - 3s 3ms/step - loss: 24.5587 - mean_absolute_error: 3.0938 - mean_squared_error: 24.5587
Epoch 2/250
1015/1015 [==============================] - 1s 781us/step - loss: 1.1781 - mean_absolute_error: 0.8499 - mean_squared_error: 1.1781
Epoch 3/250
1015/1015 [==============================] - 1s 767us/step - loss: 1.0668 - mean_absolute_error: 0.8078 - mean_squared_error: 1.0668
Epoch 4/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.9982 - mean_absolute_error: 0.7811 - mean_squared_error: 0.9982
Epoch 5/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.9648 - mean_absolute_error: 0.7659 - mean_squared_error: 0.9648
Epoch 6/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.9228 - mean_absolute_error: 0.7467 - mean_squared_error: 0.9228
Epoch 7/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.8888 - mean_absolute_error: 0.7339 - mean_squared_error: 0.8888
Epoch 8/250
1015/1015 [==============================] - 1s 765us/step - loss: 0.8793 - mean_absolute_error: 0.7295 - mean_squared_error: 0.8793
Epoch 9/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.9429 - mean_absolute_error: 0.7575 - mean_squared_error: 0.9429
Epoch 10/250
1015/1015 [==============================] - 1s 782us/step - loss: 0.7663 - mean_absolute_error: 0.6745 - mean_squared_error: 0.7663
Epoch 11/250
1015/1015 [==============================] - 1s 781us/step - loss: 0.7375 - mean_absolute_error: 0.6561 - mean_squared_error: 0.7375
Epoch 12/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.7221 - mean_absolute_error: 0.6510 - mean_squared_error: 0.7221
Epoch 13/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.6662 - mean_absolute_error: 0.6119 - mean_squared_error: 0.6662
Epoch 14/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.6937 - mean_absolute_error: 0.6320 - mean_squared_error: 0.6937
Epoch 15/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.6883 - mean_absolute_error: 0.6268 - mean_squared_error: 0.6883
Epoch 16/250
1015/1015 [==============================] - 1s 761us/step - loss: 0.6752 - mean_absolute_error: 0.6210 - mean_squared_error: 0.6752
Epoch 17/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.7092 - mean_absolute_error: 0.6437 - mean_squared_error: 0.7092
Epoch 18/250
1015/1015 [==============================] - 1s 780us/step - loss: 0.6536 - mean_absolute_error: 0.6051 - mean_squared_error: 0.6536
Epoch 19/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.6732 - mean_absolute_error: 0.6188 - mean_squared_error: 0.6732
Epoch 20/250
1015/1015 [==============================] - 1s 765us/step - loss: 0.6988 - mean_absolute_error: 0.6385 - mean_squared_error: 0.6988
Epoch 21/250
1015/1015 [==============================] - 1s 780us/step - loss: 0.6790 - mean_absolute_error: 0.6296 - mean_squared_error: 0.6790
Epoch 22/250
1015/1015 [==============================] - 1s 760us/step - loss: 0.6331 - mean_absolute_error: 0.5992 - mean_squared_error: 0.6331
Epoch 23/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.6487 - mean_absolute_error: 0.6119 - mean_squared_error: 0.6487
Epoch 24/250
1015/1015 [==============================] - 1s 776us/step - loss: 0.6453 - mean_absolute_error: 0.6134 - mean_squared_error: 0.6453
Epoch 25/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.6335 - mean_absolute_error: 0.5917 - mean_squared_error: 0.6335
Epoch 26/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.5653 - mean_absolute_error: 0.5558 - mean_squared_error: 0.5653
Epoch 27/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.5961 - mean_absolute_error: 0.5790 - mean_squared_error: 0.5961
Epoch 28/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.5347 - mean_absolute_error: 0.5453 - mean_squared_error: 0.5347
Epoch 29/250
1015/1015 [==============================] - 1s 776us/step - loss: 0.5737 - mean_absolute_error: 0.5763 - mean_squared_error: 0.5737
Epoch 30/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.6266 - mean_absolute_error: 0.6159 - mean_squared_error: 0.6266
Epoch 31/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.4857 - mean_absolute_error: 0.5232 - mean_squared_error: 0.4857
Epoch 32/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.4995 - mean_absolute_error: 0.5322 - mean_squared_error: 0.4995
Epoch 33/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.5689 - mean_absolute_error: 0.5553 - mean_squared_error: 0.5689
Epoch 34/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.5344 - mean_absolute_error: 0.5397 - mean_squared_error: 0.5344
Epoch 35/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.4649 - mean_absolute_error: 0.4975 - mean_squared_error: 0.4649
Epoch 36/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.4308 - mean_absolute_error: 0.4868 - mean_squared_error: 0.4308
Epoch 37/250
1015/1015 [==============================] - 1s 773us/step - loss: 0.3921 - mean_absolute_error: 0.4564 - mean_squared_error: 0.3921
Epoch 38/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.5071 - mean_absolute_error: 0.5329 - mean_squared_error: 0.5071
Epoch 39/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.4539 - mean_absolute_error: 0.4929 - mean_squared_error: 0.4539
Epoch 40/250
1015/1015 [==============================] - 1s 780us/step - loss: 0.4427 - mean_absolute_error: 0.4951 - mean_squared_error: 0.4427
Epoch 41/250
1015/1015 [==============================] - 1s 765us/step - loss: 0.5002 - mean_absolute_error: 0.5296 - mean_squared_error: 0.5002
Epoch 42/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.4096 - mean_absolute_error: 0.4776 - mean_squared_error: 0.4096
Epoch 43/250
1015/1015 [==============================] - 1s 759us/step - loss: 0.3671 - mean_absolute_error: 0.4403 - mean_squared_error: 0.3671
Epoch 44/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.3868 - mean_absolute_error: 0.4577 - mean_squared_error: 0.3868
Epoch 45/250
1015/1015 [==============================] - 1s 782us/step - loss: 0.4976 - mean_absolute_error: 0.5265 - mean_squared_error: 0.4976
Epoch 46/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.4874 - mean_absolute_error: 0.5081 - mean_squared_error: 0.4874
Epoch 47/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.3836 - mean_absolute_error: 0.4563 - mean_squared_error: 0.3836
Epoch 48/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.3670 - mean_absolute_error: 0.4446 - mean_squared_error: 0.3670
Epoch 49/250
1015/1015 [==============================] - 1s 786us/step - loss: 0.3819 - mean_absolute_error: 0.4499 - mean_squared_error: 0.3819
Epoch 50/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.3700 - mean_absolute_error: 0.4402 - mean_squared_error: 0.3700
Epoch 51/250
1015/1015 [==============================] - 1s 787us/step - loss: 0.4387 - mean_absolute_error: 0.5001 - mean_squared_error: 0.4387
Epoch 52/250
1015/1015 [==============================] - 1s 778us/step - loss: 0.4326 - mean_absolute_error: 0.4888 - mean_squared_error: 0.4326
Epoch 53/250
1015/1015 [==============================] - 1s 776us/step - loss: 0.4296 - mean_absolute_error: 0.4941 - mean_squared_error: 0.4296
Epoch 54/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.3956 - mean_absolute_error: 0.4563 - mean_squared_error: 0.3956
Epoch 55/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.3673 - mean_absolute_error: 0.4420 - mean_squared_error: 0.3673
Epoch 56/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.3754 - mean_absolute_error: 0.4544 - mean_squared_error: 0.3754
Epoch 57/250
1015/1015 [==============================] - 1s 777us/step - loss: 0.3405 - mean_absolute_error: 0.4209 - mean_squared_error: 0.3405
Epoch 58/250
1015/1015 [==============================] - 1s 760us/step - loss: 0.3541 - mean_absolute_error: 0.4336 - mean_squared_error: 0.3541
Epoch 59/250
1015/1015 [==============================] - 1s 783us/step - loss: 0.3773 - mean_absolute_error: 0.4503 - mean_squared_error: 0.3773
Epoch 60/250
1015/1015 [==============================] - 1s 773us/step - loss: 0.3634 - mean_absolute_error: 0.4484 - mean_squared_error: 0.3634
Epoch 61/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.3480 - mean_absolute_error: 0.4330 - mean_squared_error: 0.3480
Epoch 62/250
1015/1015 [==============================] - 1s 777us/step - loss: 0.3524 - mean_absolute_error: 0.4328 - mean_squared_error: 0.3524
Epoch 63/250
1015/1015 [==============================] - 1s 782us/step - loss: 0.3549 - mean_absolute_error: 0.4369 - mean_squared_error: 0.3549
Epoch 64/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.3699 - mean_absolute_error: 0.4385 - mean_squared_error: 0.3699
Epoch 65/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.3231 - mean_absolute_error: 0.4085 - mean_squared_error: 0.3231
Epoch 66/250
1015/1015 [==============================] - 1s 784us/step - loss: 0.3258 - mean_absolute_error: 0.4119 - mean_squared_error: 0.3258
Epoch 67/250
1015/1015 [==============================] - 1s 790us/step - loss: 0.3524 - mean_absolute_error: 0.4352 - mean_squared_error: 0.3524
Epoch 68/250
1015/1015 [==============================] - 1s 781us/step - loss: 0.3766 - mean_absolute_error: 0.4498 - mean_squared_error: 0.3766
Epoch 69/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.3108 - mean_absolute_error: 0.4013 - mean_squared_error: 0.3108
Epoch 70/250
1015/1015 [==============================] - 1s 776us/step - loss: 0.3454 - mean_absolute_error: 0.4237 - mean_squared_error: 0.3454
Epoch 71/250
1015/1015 [==============================] - 1s 765us/step - loss: 0.3473 - mean_absolute_error: 0.4319 - mean_squared_error: 0.3473
Epoch 72/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.3530 - mean_absolute_error: 0.4381 - mean_squared_error: 0.3530
Epoch 73/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.3379 - mean_absolute_error: 0.4228 - mean_squared_error: 0.3379
Epoch 74/250
1015/1015 [==============================] - 1s 776us/step - loss: 0.3176 - mean_absolute_error: 0.4052 - mean_squared_error: 0.3176
Epoch 75/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.3526 - mean_absolute_error: 0.4310 - mean_squared_error: 0.3526
Epoch 76/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.4337 - mean_absolute_error: 0.5009 - mean_squared_error: 0.4337
Epoch 77/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.3926 - mean_absolute_error: 0.4632 - mean_squared_error: 0.3926
Epoch 78/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.3358 - mean_absolute_error: 0.4187 - mean_squared_error: 0.3358
Epoch 79/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.3319 - mean_absolute_error: 0.4225 - mean_squared_error: 0.3319
Epoch 80/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.3310 - mean_absolute_error: 0.4146 - mean_squared_error: 0.3310
Epoch 81/250
1015/1015 [==============================] - 1s 783us/step - loss: 0.3165 - mean_absolute_error: 0.4039 - mean_squared_error: 0.3165
Epoch 82/250
1015/1015 [==============================] - 1s 759us/step - loss: 0.2999 - mean_absolute_error: 0.3907 - mean_squared_error: 0.2999
Epoch 83/250
1015/1015 [==============================] - 1s 779us/step - loss: 0.3089 - mean_absolute_error: 0.4028 - mean_squared_error: 0.3089
Epoch 84/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.3100 - mean_absolute_error: 0.4060 - mean_squared_error: 0.3100
Epoch 85/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.2929 - mean_absolute_error: 0.3897 - mean_squared_error: 0.2929
Epoch 86/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.3502 - mean_absolute_error: 0.4348 - mean_squared_error: 0.3502
Epoch 87/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.3213 - mean_absolute_error: 0.4118 - mean_squared_error: 0.3213
Epoch 88/250
1015/1015 [==============================] - 1s 773us/step - loss: 0.3225 - mean_absolute_error: 0.4163 - mean_squared_error: 0.3225
Epoch 89/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.3025 - mean_absolute_error: 0.4000 - mean_squared_error: 0.3025
Epoch 90/250
1015/1015 [==============================] - 1s 765us/step - loss: 0.3145 - mean_absolute_error: 0.4089 - mean_squared_error: 0.3145
Epoch 91/250
1015/1015 [==============================] - 1s 777us/step - loss: 0.3679 - mean_absolute_error: 0.4486 - mean_squared_error: 0.3679
Epoch 92/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.3280 - mean_absolute_error: 0.4159 - mean_squared_error: 0.3280
Epoch 93/250
1015/1015 [==============================] - 1s 779us/step - loss: 0.2906 - mean_absolute_error: 0.3834 - mean_squared_error: 0.2906
Epoch 94/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.2886 - mean_absolute_error: 0.3827 - mean_squared_error: 0.2886
Epoch 95/250
1015/1015 [==============================] - 1s 777us/step - loss: 0.2905 - mean_absolute_error: 0.3826 - mean_squared_error: 0.2905
Epoch 96/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.3242 - mean_absolute_error: 0.4083 - mean_squared_error: 0.3242
Epoch 97/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.2890 - mean_absolute_error: 0.3852 - mean_squared_error: 0.2890
Epoch 98/250
1015/1015 [==============================] - 1s 752us/step - loss: 0.2790 - mean_absolute_error: 0.3757 - mean_squared_error: 0.2790
Epoch 99/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.2817 - mean_absolute_error: 0.3762 - mean_squared_error: 0.2817
Epoch 100/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.2792 - mean_absolute_error: 0.3738 - mean_squared_error: 0.2792
Epoch 101/250
1015/1015 [==============================] - 1s 763us/step - loss: 0.2748 - mean_absolute_error: 0.3733 - mean_squared_error: 0.2748
Epoch 102/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.2850 - mean_absolute_error: 0.3820 - mean_squared_error: 0.2850
Epoch 103/250
1015/1015 [==============================] - 1s 779us/step - loss: 0.2927 - mean_absolute_error: 0.3851 - mean_squared_error: 0.2927
Epoch 104/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.2930 - mean_absolute_error: 0.3838 - mean_squared_error: 0.2930
Epoch 105/250
1015/1015 [==============================] - 1s 780us/step - loss: 0.2753 - mean_absolute_error: 0.3717 - mean_squared_error: 0.2753
Epoch 106/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.2555 - mean_absolute_error: 0.3549 - mean_squared_error: 0.2555
Epoch 107/250
1015/1015 [==============================] - 1s 756us/step - loss: 0.3249 - mean_absolute_error: 0.4149 - mean_squared_error: 0.3249
Epoch 108/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.2975 - mean_absolute_error: 0.3966 - mean_squared_error: 0.2975
Epoch 109/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.2669 - mean_absolute_error: 0.3667 - mean_squared_error: 0.2669
Epoch 110/250
1015/1015 [==============================] - 1s 782us/step - loss: 0.2549 - mean_absolute_error: 0.3563 - mean_squared_error: 0.2549
Epoch 111/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.2676 - mean_absolute_error: 0.3627 - mean_squared_error: 0.2676
Epoch 112/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.2466 - mean_absolute_error: 0.3481 - mean_squared_error: 0.2466
Epoch 113/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.2810 - mean_absolute_error: 0.3862 - mean_squared_error: 0.2810
Epoch 114/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.2857 - mean_absolute_error: 0.3768 - mean_squared_error: 0.2857
Epoch 115/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.2694 - mean_absolute_error: 0.3734 - mean_squared_error: 0.2694
Epoch 116/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.2540 - mean_absolute_error: 0.3534 - mean_squared_error: 0.2540
Epoch 117/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.2365 - mean_absolute_error: 0.3405 - mean_squared_error: 0.2365
Epoch 118/250
1015/1015 [==============================] - 1s 763us/step - loss: 0.2772 - mean_absolute_error: 0.3737 - mean_squared_error: 0.2772
Epoch 119/250
1015/1015 [==============================] - 1s 759us/step - loss: 0.2936 - mean_absolute_error: 0.3889 - mean_squared_error: 0.2936
Epoch 120/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.2864 - mean_absolute_error: 0.3785 - mean_squared_error: 0.2864
Epoch 121/250
1015/1015 [==============================] - 1s 784us/step - loss: 0.2793 - mean_absolute_error: 0.3804 - mean_squared_error: 0.2793
Epoch 122/250
1015/1015 [==============================] - 1s 759us/step - loss: 0.2616 - mean_absolute_error: 0.3674 - mean_squared_error: 0.2616
Epoch 123/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.2659 - mean_absolute_error: 0.3679 - mean_squared_error: 0.2659
Epoch 124/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.2541 - mean_absolute_error: 0.3585 - mean_squared_error: 0.2541
Epoch 125/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.2687 - mean_absolute_error: 0.3706 - mean_squared_error: 0.2687
Epoch 126/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.2882 - mean_absolute_error: 0.3888 - mean_squared_error: 0.2882
Epoch 127/250
1015/1015 [==============================] - 1s 756us/step - loss: 0.2855 - mean_absolute_error: 0.3894 - mean_squared_error: 0.2855
Epoch 128/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.2343 - mean_absolute_error: 0.3384 - mean_squared_error: 0.2343
Epoch 129/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.2409 - mean_absolute_error: 0.3508 - mean_squared_error: 0.2409
Epoch 130/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.2871 - mean_absolute_error: 0.3772 - mean_squared_error: 0.2871
Epoch 131/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.2564 - mean_absolute_error: 0.3619 - mean_squared_error: 0.2564
Epoch 132/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.2542 - mean_absolute_error: 0.3595 - mean_squared_error: 0.2542
Epoch 133/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.2799 - mean_absolute_error: 0.3865 - mean_squared_error: 0.2799
Epoch 134/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.2265 - mean_absolute_error: 0.3329 - mean_squared_error: 0.2265
Epoch 135/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.2519 - mean_absolute_error: 0.3580 - mean_squared_error: 0.2519
Epoch 136/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.2390 - mean_absolute_error: 0.3485 - mean_squared_error: 0.2390
Epoch 137/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.2961 - mean_absolute_error: 0.3814 - mean_squared_error: 0.2961
Epoch 138/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.2702 - mean_absolute_error: 0.3735 - mean_squared_error: 0.2702
Epoch 139/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.2364 - mean_absolute_error: 0.3438 - mean_squared_error: 0.2364
Epoch 140/250
1015/1015 [==============================] - 1s 763us/step - loss: 0.2341 - mean_absolute_error: 0.3452 - mean_squared_error: 0.2341
Epoch 141/250
1015/1015 [==============================] - 1s 765us/step - loss: 0.2406 - mean_absolute_error: 0.3515 - mean_squared_error: 0.2406
Epoch 142/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.2596 - mean_absolute_error: 0.3650 - mean_squared_error: 0.2596
Epoch 143/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.2558 - mean_absolute_error: 0.3649 - mean_squared_error: 0.2558
Epoch 144/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.2584 - mean_absolute_error: 0.3706 - mean_squared_error: 0.2584
Epoch 145/250
1015/1015 [==============================] - 1s 784us/step - loss: 0.2633 - mean_absolute_error: 0.3634 - mean_squared_error: 0.2633
Epoch 146/250
1015/1015 [==============================] - 1s 760us/step - loss: 0.2466 - mean_absolute_error: 0.3608 - mean_squared_error: 0.2466
Epoch 147/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.2278 - mean_absolute_error: 0.3362 - mean_squared_error: 0.2278
Epoch 148/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.2466 - mean_absolute_error: 0.3569 - mean_squared_error: 0.2466
Epoch 149/250
1015/1015 [==============================] - 1s 761us/step - loss: 0.2229 - mean_absolute_error: 0.3377 - mean_squared_error: 0.2229
Epoch 150/250
1015/1015 [==============================] - 1s 776us/step - loss: 0.2397 - mean_absolute_error: 0.3481 - mean_squared_error: 0.2397
Epoch 151/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.2094 - mean_absolute_error: 0.3248 - mean_squared_error: 0.2094
Epoch 152/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.2582 - mean_absolute_error: 0.3550 - mean_squared_error: 0.2582
Epoch 153/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.2169 - mean_absolute_error: 0.3201 - mean_squared_error: 0.2169
Epoch 154/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.2101 - mean_absolute_error: 0.3261 - mean_squared_error: 0.2101
Epoch 155/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.2199 - mean_absolute_error: 0.3223 - mean_squared_error: 0.2199
Epoch 156/250
1015/1015 [==============================] - 1s 761us/step - loss: 0.2095 - mean_absolute_error: 0.3172 - mean_squared_error: 0.2095
Epoch 157/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.2819 - mean_absolute_error: 0.3764 - mean_squared_error: 0.2819
Epoch 158/250
1015/1015 [==============================] - 1s 782us/step - loss: 0.2470 - mean_absolute_error: 0.3561 - mean_squared_error: 0.2470
Epoch 159/250
1015/1015 [==============================] - 1s 773us/step - loss: 0.2163 - mean_absolute_error: 0.3327 - mean_squared_error: 0.2163
Epoch 160/250
1015/1015 [==============================] - 1s 778us/step - loss: 0.2163 - mean_absolute_error: 0.3318 - mean_squared_error: 0.2163
Epoch 161/250
1015/1015 [==============================] - 1s 784us/step - loss: 0.2030 - mean_absolute_error: 0.3219 - mean_squared_error: 0.2030
Epoch 162/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.2437 - mean_absolute_error: 0.3489 - mean_squared_error: 0.2437
Epoch 163/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.2257 - mean_absolute_error: 0.3388 - mean_squared_error: 0.2257
Epoch 164/250
1015/1015 [==============================] - 1s 778us/step - loss: 0.2731 - mean_absolute_error: 0.3882 - mean_squared_error: 0.2731
Epoch 165/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.1937 - mean_absolute_error: 0.3101 - mean_squared_error: 0.1937
Epoch 166/250
1015/1015 [==============================] - 1s 765us/step - loss: 0.2306 - mean_absolute_error: 0.3358 - mean_squared_error: 0.2306
Epoch 167/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.2268 - mean_absolute_error: 0.3391 - mean_squared_error: 0.2268
Epoch 168/250
1015/1015 [==============================] - 1s 760us/step - loss: 0.2275 - mean_absolute_error: 0.3393 - mean_squared_error: 0.2275
Epoch 169/250
1015/1015 [==============================] - 1s 756us/step - loss: 0.2238 - mean_absolute_error: 0.3433 - mean_squared_error: 0.2238
Epoch 170/250
1015/1015 [==============================] - 1s 773us/step - loss: 0.2176 - mean_absolute_error: 0.3390 - mean_squared_error: 0.2176
Epoch 171/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.2253 - mean_absolute_error: 0.3371 - mean_squared_error: 0.2253
Epoch 172/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.2395 - mean_absolute_error: 0.3427 - mean_squared_error: 0.2395
Epoch 173/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.2136 - mean_absolute_error: 0.3291 - mean_squared_error: 0.2136
Epoch 174/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.2114 - mean_absolute_error: 0.3238 - mean_squared_error: 0.2114
Epoch 175/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.2144 - mean_absolute_error: 0.3304 - mean_squared_error: 0.2144
Epoch 176/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.2173 - mean_absolute_error: 0.3336 - mean_squared_error: 0.2173
Epoch 177/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.2218 - mean_absolute_error: 0.3368 - mean_squared_error: 0.2218
Epoch 178/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.2232 - mean_absolute_error: 0.3422 - mean_squared_error: 0.2232
Epoch 179/250
1015/1015 [==============================] - 1s 779us/step - loss: 0.2271 - mean_absolute_error: 0.3356 - mean_squared_error: 0.2271
Epoch 180/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.2075 - mean_absolute_error: 0.3280 - mean_squared_error: 0.2075
Epoch 181/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.2007 - mean_absolute_error: 0.3179 - mean_squared_error: 0.2007
Epoch 182/250
1015/1015 [==============================] - 1s 760us/step - loss: 0.2392 - mean_absolute_error: 0.3554 - mean_squared_error: 0.2392
Epoch 183/250
1015/1015 [==============================] - 1s 788us/step - loss: 0.1783 - mean_absolute_error: 0.2963 - mean_squared_error: 0.1783
Epoch 184/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.2208 - mean_absolute_error: 0.3345 - mean_squared_error: 0.2208
Epoch 185/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.2104 - mean_absolute_error: 0.3349 - mean_squared_error: 0.2104
Epoch 186/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.1874 - mean_absolute_error: 0.3085 - mean_squared_error: 0.1874
Epoch 187/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.2339 - mean_absolute_error: 0.3468 - mean_squared_error: 0.2339
Epoch 188/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.3134 - mean_absolute_error: 0.3908 - mean_squared_error: 0.3134
Epoch 189/250
1015/1015 [==============================] - 1s 773us/step - loss: 0.2911 - mean_absolute_error: 0.3732 - mean_squared_error: 0.2911
Epoch 190/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.2121 - mean_absolute_error: 0.3265 - mean_squared_error: 0.2121
Epoch 191/250
1015/1015 [==============================] - 1s 780us/step - loss: 0.2108 - mean_absolute_error: 0.3299 - mean_squared_error: 0.2108
Epoch 192/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.2070 - mean_absolute_error: 0.3277 - mean_squared_error: 0.2070
Epoch 193/250
1015/1015 [==============================] - 1s 773us/step - loss: 0.1983 - mean_absolute_error: 0.3173 - mean_squared_error: 0.1983
Epoch 194/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.2409 - mean_absolute_error: 0.3522 - mean_squared_error: 0.2409
Epoch 195/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.2165 - mean_absolute_error: 0.3327 - mean_squared_error: 0.2165
Epoch 196/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.1803 - mean_absolute_error: 0.3060 - mean_squared_error: 0.1803
Epoch 197/250
1015/1015 [==============================] - 1s 784us/step - loss: 0.2122 - mean_absolute_error: 0.3308 - mean_squared_error: 0.2122
Epoch 198/250
1015/1015 [==============================] - 1s 778us/step - loss: 0.2037 - mean_absolute_error: 0.3181 - mean_squared_error: 0.2037
Epoch 199/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.2151 - mean_absolute_error: 0.3420 - mean_squared_error: 0.2151
Epoch 200/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.2290 - mean_absolute_error: 0.3475 - mean_squared_error: 0.2290
Epoch 201/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.1866 - mean_absolute_error: 0.3057 - mean_squared_error: 0.1866
Epoch 202/250
1015/1015 [==============================] - 1s 776us/step - loss: 0.2054 - mean_absolute_error: 0.3238 - mean_squared_error: 0.2054
Epoch 203/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.2249 - mean_absolute_error: 0.3486 - mean_squared_error: 0.2249
Epoch 204/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.2111 - mean_absolute_error: 0.3330 - mean_squared_error: 0.2111
Epoch 205/250
1015/1015 [==============================] - 1s 776us/step - loss: 0.2065 - mean_absolute_error: 0.3258 - mean_squared_error: 0.2065
Epoch 206/250
1015/1015 [==============================] - 1s 763us/step - loss: 0.2254 - mean_absolute_error: 0.3387 - mean_squared_error: 0.2254
Epoch 207/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.1884 - mean_absolute_error: 0.3092 - mean_squared_error: 0.1884
Epoch 208/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.2250 - mean_absolute_error: 0.3306 - mean_squared_error: 0.2250
Epoch 209/250
1015/1015 [==============================] - 1s 773us/step - loss: 0.1932 - mean_absolute_error: 0.3147 - mean_squared_error: 0.1932
Epoch 210/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.1876 - mean_absolute_error: 0.3100 - mean_squared_error: 0.1876
Epoch 211/250
1015/1015 [==============================] - 1s 754us/step - loss: 0.1803 - mean_absolute_error: 0.3008 - mean_squared_error: 0.1803
Epoch 212/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.1700 - mean_absolute_error: 0.2929 - mean_squared_error: 0.1700
Epoch 213/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.1778 - mean_absolute_error: 0.2884 - mean_squared_error: 0.1778
Epoch 214/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.1634 - mean_absolute_error: 0.2836 - mean_squared_error: 0.1634
Epoch 215/250
1015/1015 [==============================] - 1s 776us/step - loss: 0.1788 - mean_absolute_error: 0.2938 - mean_squared_error: 0.1788
Epoch 216/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.2476 - mean_absolute_error: 0.3419 - mean_squared_error: 0.2476
Epoch 217/250
1015/1015 [==============================] - 1s 773us/step - loss: 0.1591 - mean_absolute_error: 0.2801 - mean_squared_error: 0.1591
Epoch 218/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.1825 - mean_absolute_error: 0.3007 - mean_squared_error: 0.1825
Epoch 219/250
1015/1015 [==============================] - 1s 765us/step - loss: 0.1704 - mean_absolute_error: 0.2937 - mean_squared_error: 0.1704
Epoch 220/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.1702 - mean_absolute_error: 0.2906 - mean_squared_error: 0.1702
Epoch 221/250
1015/1015 [==============================] - 1s 787us/step - loss: 0.2419 - mean_absolute_error: 0.3579 - mean_squared_error: 0.2419
Epoch 222/250
1015/1015 [==============================] - 1s 769us/step - loss: 0.1779 - mean_absolute_error: 0.3030 - mean_squared_error: 0.1779
Epoch 223/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.1596 - mean_absolute_error: 0.2769 - mean_squared_error: 0.1596
Epoch 224/250
1015/1015 [==============================] - 1s 765us/step - loss: 0.1459 - mean_absolute_error: 0.2685 - mean_squared_error: 0.1459
Epoch 225/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.1818 - mean_absolute_error: 0.2958 - mean_squared_error: 0.1818
Epoch 226/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.1773 - mean_absolute_error: 0.2927 - mean_squared_error: 0.1773
Epoch 227/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.1859 - mean_absolute_error: 0.3098 - mean_squared_error: 0.1859
Epoch 228/250
1015/1015 [==============================] - 1s 767us/step - loss: 0.1833 - mean_absolute_error: 0.3119 - mean_squared_error: 0.1833
Epoch 229/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.1822 - mean_absolute_error: 0.3023 - mean_squared_error: 0.1822
Epoch 230/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.2243 - mean_absolute_error: 0.3408 - mean_squared_error: 0.2243
Epoch 231/250
1015/1015 [==============================] - 1s 766us/step - loss: 0.1657 - mean_absolute_error: 0.2855 - mean_squared_error: 0.1657
Epoch 232/250
1015/1015 [==============================] - 1s 765us/step - loss: 0.2269 - mean_absolute_error: 0.3334 - mean_squared_error: 0.2269
Epoch 233/250
1015/1015 [==============================] - 1s 781us/step - loss: 0.1838 - mean_absolute_error: 0.3077 - mean_squared_error: 0.1838
Epoch 234/250
1015/1015 [==============================] - 1s 764us/step - loss: 0.1822 - mean_absolute_error: 0.3023 - mean_squared_error: 0.1822
Epoch 235/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.1645 - mean_absolute_error: 0.2847 - mean_squared_error: 0.1645
Epoch 236/250
1015/1015 [==============================] - 1s 772us/step - loss: 0.1753 - mean_absolute_error: 0.2960 - mean_squared_error: 0.1753
Epoch 237/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.1727 - mean_absolute_error: 0.2888 - mean_squared_error: 0.1727
Epoch 238/250
1015/1015 [==============================] - 1s 768us/step - loss: 0.1709 - mean_absolute_error: 0.2930 - mean_squared_error: 0.1709
Epoch 239/250
1015/1015 [==============================] - 1s 761us/step - loss: 0.1771 - mean_absolute_error: 0.2985 - mean_squared_error: 0.1771
Epoch 240/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.1702 - mean_absolute_error: 0.2927 - mean_squared_error: 0.1702
Epoch 241/250
1015/1015 [==============================] - 1s 770us/step - loss: 0.1991 - mean_absolute_error: 0.3288 - mean_squared_error: 0.1991
Epoch 242/250
1015/1015 [==============================] - 1s 775us/step - loss: 0.1795 - mean_absolute_error: 0.2953 - mean_squared_error: 0.1795
Epoch 243/250
1015/1015 [==============================] - 1s 779us/step - loss: 0.1905 - mean_absolute_error: 0.3174 - mean_squared_error: 0.1905
Epoch 244/250
1015/1015 [==============================] - 1s 778us/step - loss: 0.1753 - mean_absolute_error: 0.2913 - mean_squared_error: 0.1753
Epoch 245/250
1015/1015 [==============================] - 1s 762us/step - loss: 0.1829 - mean_absolute_error: 0.2965 - mean_squared_error: 0.1829
Epoch 246/250
1015/1015 [==============================] - 1s 788us/step - loss: 0.1653 - mean_absolute_error: 0.2938 - mean_squared_error: 0.1653
Epoch 247/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.1729 - mean_absolute_error: 0.3055 - mean_squared_error: 0.1729
Epoch 248/250
1015/1015 [==============================] - 1s 776us/step - loss: 0.1522 - mean_absolute_error: 0.2738 - mean_squared_error: 0.1522
Epoch 249/250
1015/1015 [==============================] - 1s 774us/step - loss: 0.1645 - mean_absolute_error: 0.2866 - mean_squared_error: 0.1645
Epoch 250/250
1015/1015 [==============================] - 1s 771us/step - loss: 0.1851 - mean_absolute_error: 0.3059 - mean_squared_error: 0.1851
Out[57]:
<keras.callbacks.History at 0x7f452d2626a0>
In [58]:
labels_estimated_testing_scaled = clf.predict(input_test)

labels_estimated_testing = ScalerNN.inverse_transform(labels_estimated_testing_scaled)

fig, ax = plt.subplots(2, 3, figsize = (18,10))
for i in range(5):
    RMS = np.sqrt(mean_squared_error(labels_estimated_testing[:,i], labels_test[:,i]))
    RMS2 = np.sqrt(mean_squared_error(labels_estimated_testing_scaled[:,i], labels_test_scaled[:,i]))
    MAE = mean_absolute_error(labels_estimated_testing[:,i], labels_test[:,i])
    bias = np.sum(labels_estimated_testing[:,i] - labels_test[:,i])/nr_training_stars
    ax[i//3, i%3].scatter(labels_test[:,i], labels_estimated_testing[:,i], s=4, c="navy")
    ax[i//3, i%3].plot([-100,6000],[-100,6000], c="y")
    ax[i//3, i%3].set_xlim(*label_xlims[i]); ax[i//3, i%3].set_ylim(*label_xlims[i])
    ax[i//3, i%3].set_title(f"{label_names[i]} - Testing Data\nRMSE = {RMS:.4f}  |  RMSE(scaled) = {RMS2:.4f}\nMAE={MAE:.4f}    |    BIAS = {bias:.4f}")
    ax[i//3, i%3].set_xlabel("ASPCAP Input Labels"); ax[i//3, i%3].set_ylabel("Model Predicted Labels")
ax[-1,-1].axis('off')
plt.tight_layout();

6.3 Expanding the Model - Data Augmentation

One of the main disadvantages of the neural network is its inability to make direct use of the provided spectrum errors One way of indirectly doing this, together with creating a larger dataset, is data augmentation.

We will bootstrap an increased dataset of size 10'000, 10 times the size of our original set, sampled from our original dataset. The sampling will be performed by randomly picking stars in our dataset (with a slight bias towards stars with a lower total error), and slightly augmenting their spectrum. This is done independently on each pixel, picking a random value sampled from a gaussian curve with their respective standard errors. This is performed with the numpy.random.normal module.

$$ f' = f + \epsilon, \quad\quad P(\epsilon) = \frac{1}{\sqrt{2\sigma_f^2}} e^{\epsilon^2/\sigma_f^2} $$

One question that arises is how to deal with the "bad" pixels, marked with $10^{10}$ errors. It is obviously unacceptable to leave them as such in the sampling. We found that setting such errors down to 0.05 provided the best results. This is on the very high end of what found naturally in the dataset.

The network layout will be left the same, as we are still trying to fit the same overall model. We have however substantially increased our batch size, to 512, to reflect our increased data size. We have also increased the number of epochs to 1000. This might raise concerns of overfitting, but we have this time let Keras set aside 20% of our dataset as a validation set. Keras can then be asked to only load the epoch in which the MSE of this validation data was at its lowest.

As we can see from the figures below, the results are very promising, closing almost the entire gap the original model had on our neural network. Below this, we have plotted a figure of MSE (green), together with validation data MSE (blue). The 10 lowest validation data MSE are shown as red dots. As we can see, we have not overfitted our data, as the validation MSE is still decreasing, and a longer run might actually further improve our results.

In [0]:
error_aug = err_train.copy()
error_aug[error_aug > 0.1] = 0.05
star_weights = 1/(100 + np.sum(error_aug, axis=1))
star_weights = star_weights/np.sum(star_weights)
aug_input = np.zeros((10000, nr_pixels_reduced))
aug_output = np.zeros((10000, 5))

samples_idx = np.random.choice(np.arange(0, nr_training_stars), size=(10000), p=star_weights)
for i in range(10000):
    chosen_star = samples_idx[i]
    aug_input[i] = input_train[chosen_star] + np.random.normal(loc=0, scale=error_aug[chosen_star], size=nr_pixels_reduced)
    aug_output[i] = output_train_scaled[chosen_star]
In [60]:
opti = optimizers.adam(amsgrad=True)
def nn_clf():
    model = Sequential()
    model.add(Dense(1000, input_shape=(7214,), activation="linear", kernel_initializer='random_uniform'))
    model.add(Dense(2000,  activation="linear", kernel_initializer='random_uniform'))
    model.add(Dense(2000,  activation="linear", kernel_initializer='random_uniform'))
    model.add(Dense(200,  activation="linear", kernel_initializer='random_uniform'))
    model.add(Dense(20, activation="linear", kernel_initializer='random_uniform'))
    model.add(Dense(5, activation="linear", kernel_initializer='random_uniform'))
    model.compile(optimizer=opti, loss='mean_squared_error',  metrics=['mae',"mse"])
    return model

mcp_save = ModelCheckpoint('.mdl_wts.hdf5', save_best_only=True, monitor='val_loss', mode='min')

history = History()
clf = KerasRegressor(build_fn=nn_clf, batch_size=512, epochs=1000, callbacks=[mcp_save, history], validation_split=0.20)
history = clf.fit(aug_input, aug_output)
Train on 8000 samples, validate on 2000 samples
Epoch 1/1000
8000/8000 [==============================] - 3s 348us/step - loss: 41.2039 - mean_absolute_error: 4.3904 - mean_squared_error: 41.2039 - val_loss: 6.3167 - val_mean_absolute_error: 1.9117 - val_mean_squared_error: 6.3167
Epoch 2/1000
8000/8000 [==============================] - 1s 154us/step - loss: 3.0954 - mean_absolute_error: 1.4157 - mean_squared_error: 3.0954 - val_loss: 1.4174 - val_mean_absolute_error: 0.9457 - val_mean_squared_error: 1.4174
Epoch 3/1000
8000/8000 [==============================] - 1s 150us/step - loss: 1.1858 - mean_absolute_error: 0.8603 - mean_squared_error: 1.1858 - val_loss: 1.0695 - val_mean_absolute_error: 0.8252 - val_mean_squared_error: 1.0695
Epoch 4/1000
8000/8000 [==============================] - 1s 149us/step - loss: 1.0040 - mean_absolute_error: 0.7856 - mean_squared_error: 1.0040 - val_loss: 0.9433 - val_mean_absolute_error: 0.7523 - val_mean_squared_error: 0.9433
Epoch 5/1000
8000/8000 [==============================] - 1s 149us/step - loss: 0.9450 - mean_absolute_error: 0.7600 - mean_squared_error: 0.9450 - val_loss: 0.9189 - val_mean_absolute_error: 0.7457 - val_mean_squared_error: 0.9189
Epoch 6/1000
8000/8000 [==============================] - 1s 162us/step - loss: 0.9166 - mean_absolute_error: 0.7478 - mean_squared_error: 0.9166 - val_loss: 0.9279 - val_mean_absolute_error: 0.7762 - val_mean_squared_error: 0.9279
Epoch 7/1000
8000/8000 [==============================] - 1s 146us/step - loss: 0.9142 - mean_absolute_error: 0.7471 - mean_squared_error: 0.9142 - val_loss: 0.8864 - val_mean_absolute_error: 0.7214 - val_mean_squared_error: 0.8864
Epoch 8/1000
8000/8000 [==============================] - 1s 149us/step - loss: 0.9062 - mean_absolute_error: 0.7436 - mean_squared_error: 0.9062 - val_loss: 0.8914 - val_mean_absolute_error: 0.7551 - val_mean_squared_error: 0.8914
Epoch 9/1000
8000/8000 [==============================] - 1s 149us/step - loss: 0.8449 - mean_absolute_error: 0.7164 - mean_squared_error: 0.8449 - val_loss: 0.8204 - val_mean_absolute_error: 0.6934 - val_mean_squared_error: 0.8204
Epoch 10/1000
8000/8000 [==============================] - 1s 148us/step - loss: 1.1614 - mean_absolute_error: 0.8398 - mean_squared_error: 1.1614 - val_loss: 1.1170 - val_mean_absolute_error: 0.8276 - val_mean_squared_error: 1.1170
Epoch 11/1000
8000/8000 [==============================] - 1s 148us/step - loss: 0.9552 - mean_absolute_error: 0.7716 - mean_squared_error: 0.9552 - val_loss: 0.9127 - val_mean_absolute_error: 0.7120 - val_mean_squared_error: 0.9127
Epoch 12/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.8802 - mean_absolute_error: 0.7336 - mean_squared_error: 0.8802 - val_loss: 0.8454 - val_mean_absolute_error: 0.7058 - val_mean_squared_error: 0.8454
Epoch 13/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.8322 - mean_absolute_error: 0.7112 - mean_squared_error: 0.8322 - val_loss: 0.7844 - val_mean_absolute_error: 0.6854 - val_mean_squared_error: 0.7844
Epoch 14/1000
8000/8000 [==============================] - 1s 151us/step - loss: 0.7736 - mean_absolute_error: 0.6786 - mean_squared_error: 0.7736 - val_loss: 0.7345 - val_mean_absolute_error: 0.6567 - val_mean_squared_error: 0.7345
Epoch 15/1000
8000/8000 [==============================] - 1s 150us/step - loss: 0.7249 - mean_absolute_error: 0.6489 - mean_squared_error: 0.7249 - val_loss: 0.7273 - val_mean_absolute_error: 0.6428 - val_mean_squared_error: 0.7273
Epoch 16/1000
8000/8000 [==============================] - 1s 148us/step - loss: 0.7405 - mean_absolute_error: 0.6566 - mean_squared_error: 0.7405 - val_loss: 0.9091 - val_mean_absolute_error: 0.7766 - val_mean_squared_error: 0.9091
Epoch 17/1000
8000/8000 [==============================] - 1s 146us/step - loss: 0.7806 - mean_absolute_error: 0.6860 - mean_squared_error: 0.7806 - val_loss: 0.7379 - val_mean_absolute_error: 0.6544 - val_mean_squared_error: 0.7379
Epoch 18/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.6976 - mean_absolute_error: 0.6312 - mean_squared_error: 0.6976 - val_loss: 0.6855 - val_mean_absolute_error: 0.6306 - val_mean_squared_error: 0.6855
Epoch 19/1000
8000/8000 [==============================] - 1s 149us/step - loss: 0.6832 - mean_absolute_error: 0.6214 - mean_squared_error: 0.6832 - val_loss: 0.8355 - val_mean_absolute_error: 0.7372 - val_mean_squared_error: 0.8355
Epoch 20/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.7299 - mean_absolute_error: 0.6578 - mean_squared_error: 0.7299 - val_loss: 0.6447 - val_mean_absolute_error: 0.5957 - val_mean_squared_error: 0.6447
Epoch 21/1000
8000/8000 [==============================] - 1s 147us/step - loss: 0.6466 - mean_absolute_error: 0.5920 - mean_squared_error: 0.6466 - val_loss: 0.6205 - val_mean_absolute_error: 0.5733 - val_mean_squared_error: 0.6205
Epoch 22/1000
8000/8000 [==============================] - 1s 149us/step - loss: 0.7029 - mean_absolute_error: 0.6395 - mean_squared_error: 0.7029 - val_loss: 0.7164 - val_mean_absolute_error: 0.6618 - val_mean_squared_error: 0.7164
Epoch 23/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.6685 - mean_absolute_error: 0.6143 - mean_squared_error: 0.6685 - val_loss: 0.6181 - val_mean_absolute_error: 0.5766 - val_mean_squared_error: 0.6181
Epoch 24/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.6362 - mean_absolute_error: 0.5896 - mean_squared_error: 0.6362 - val_loss: 0.6039 - val_mean_absolute_error: 0.5661 - val_mean_squared_error: 0.6039
Epoch 25/1000
8000/8000 [==============================] - 1s 149us/step - loss: 0.6277 - mean_absolute_error: 0.5820 - mean_squared_error: 0.6277 - val_loss: 0.6092 - val_mean_absolute_error: 0.5765 - val_mean_squared_error: 0.6092
Epoch 26/1000
8000/8000 [==============================] - 1s 146us/step - loss: 0.6169 - mean_absolute_error: 0.5792 - mean_squared_error: 0.6169 - val_loss: 0.5983 - val_mean_absolute_error: 0.5543 - val_mean_squared_error: 0.5983
Epoch 27/1000
8000/8000 [==============================] - 1s 148us/step - loss: 0.6101 - mean_absolute_error: 0.5724 - mean_squared_error: 0.6101 - val_loss: 0.5822 - val_mean_absolute_error: 0.5544 - val_mean_squared_error: 0.5822
Epoch 28/1000
8000/8000 [==============================] - 1s 158us/step - loss: 0.6227 - mean_absolute_error: 0.5887 - mean_squared_error: 0.6227 - val_loss: 0.6136 - val_mean_absolute_error: 0.6005 - val_mean_squared_error: 0.6136
Epoch 29/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.6003 - mean_absolute_error: 0.5729 - mean_squared_error: 0.6003 - val_loss: 0.5679 - val_mean_absolute_error: 0.5479 - val_mean_squared_error: 0.5679
Epoch 30/1000
8000/8000 [==============================] - 1s 148us/step - loss: 0.5828 - mean_absolute_error: 0.5592 - mean_squared_error: 0.5828 - val_loss: 0.6421 - val_mean_absolute_error: 0.6276 - val_mean_squared_error: 0.6421
Epoch 31/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.6050 - mean_absolute_error: 0.5842 - mean_squared_error: 0.6050 - val_loss: 0.5658 - val_mean_absolute_error: 0.5661 - val_mean_squared_error: 0.5658
Epoch 32/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.5762 - mean_absolute_error: 0.5643 - mean_squared_error: 0.5762 - val_loss: 0.5393 - val_mean_absolute_error: 0.5237 - val_mean_squared_error: 0.5393
Epoch 33/1000
8000/8000 [==============================] - 1s 146us/step - loss: 0.5644 - mean_absolute_error: 0.5587 - mean_squared_error: 0.5644 - val_loss: 0.5498 - val_mean_absolute_error: 0.5710 - val_mean_squared_error: 0.5498
Epoch 34/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.5985 - mean_absolute_error: 0.5870 - mean_squared_error: 0.5985 - val_loss: 0.5585 - val_mean_absolute_error: 0.5843 - val_mean_squared_error: 0.5585
Epoch 35/1000
8000/8000 [==============================] - 1s 146us/step - loss: 0.5275 - mean_absolute_error: 0.5394 - mean_squared_error: 0.5275 - val_loss: 0.5398 - val_mean_absolute_error: 0.5446 - val_mean_squared_error: 0.5398
Epoch 36/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.6172 - mean_absolute_error: 0.6008 - mean_squared_error: 0.6172 - val_loss: 0.5671 - val_mean_absolute_error: 0.5487 - val_mean_squared_error: 0.5671
Epoch 37/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.5150 - mean_absolute_error: 0.5344 - mean_squared_error: 0.5150 - val_loss: 0.4830 - val_mean_absolute_error: 0.5300 - val_mean_squared_error: 0.4830
Epoch 38/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.5448 - mean_absolute_error: 0.5516 - mean_squared_error: 0.5448 - val_loss: 0.4826 - val_mean_absolute_error: 0.5114 - val_mean_squared_error: 0.4826
Epoch 39/1000
8000/8000 [==============================] - 1s 146us/step - loss: 0.4925 - mean_absolute_error: 0.5209 - mean_squared_error: 0.4925 - val_loss: 0.4760 - val_mean_absolute_error: 0.5232 - val_mean_squared_error: 0.4760
Epoch 40/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.4537 - mean_absolute_error: 0.4998 - mean_squared_error: 0.4537 - val_loss: 0.4131 - val_mean_absolute_error: 0.4749 - val_mean_squared_error: 0.4131
Epoch 41/1000
8000/8000 [==============================] - 1s 146us/step - loss: 0.5161 - mean_absolute_error: 0.5384 - mean_squared_error: 0.5161 - val_loss: 0.5231 - val_mean_absolute_error: 0.5635 - val_mean_squared_error: 0.5231
Epoch 42/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.4771 - mean_absolute_error: 0.5170 - mean_squared_error: 0.4771 - val_loss: 0.6027 - val_mean_absolute_error: 0.5557 - val_mean_squared_error: 0.6027
Epoch 43/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.4503 - mean_absolute_error: 0.4922 - mean_squared_error: 0.4503 - val_loss: 0.3784 - val_mean_absolute_error: 0.4396 - val_mean_squared_error: 0.3784
Epoch 44/1000
8000/8000 [==============================] - 1s 159us/step - loss: 0.3989 - mean_absolute_error: 0.4671 - mean_squared_error: 0.3989 - val_loss: 0.4143 - val_mean_absolute_error: 0.4857 - val_mean_squared_error: 0.4143
Epoch 45/1000
8000/8000 [==============================] - 1s 144us/step - loss: 1.4085 - mean_absolute_error: 0.8783 - mean_squared_error: 1.4085 - val_loss: 0.9724 - val_mean_absolute_error: 0.7301 - val_mean_squared_error: 0.9724
Epoch 46/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.8244 - mean_absolute_error: 0.6863 - mean_squared_error: 0.8244 - val_loss: 0.7260 - val_mean_absolute_error: 0.6373 - val_mean_squared_error: 0.7260
Epoch 47/1000
8000/8000 [==============================] - 1s 147us/step - loss: 0.6933 - mean_absolute_error: 0.6181 - mean_squared_error: 0.6933 - val_loss: 0.6537 - val_mean_absolute_error: 0.5867 - val_mean_squared_error: 0.6537
Epoch 48/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.6424 - mean_absolute_error: 0.5873 - mean_squared_error: 0.6424 - val_loss: 0.6301 - val_mean_absolute_error: 0.5869 - val_mean_squared_error: 0.6301
Epoch 49/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.6109 - mean_absolute_error: 0.5739 - mean_squared_error: 0.6109 - val_loss: 0.5802 - val_mean_absolute_error: 0.5491 - val_mean_squared_error: 0.5802
Epoch 50/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.5809 - mean_absolute_error: 0.5566 - mean_squared_error: 0.5809 - val_loss: 0.5557 - val_mean_absolute_error: 0.5415 - val_mean_squared_error: 0.5557
Epoch 51/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.5467 - mean_absolute_error: 0.5394 - mean_squared_error: 0.5467 - val_loss: 0.5196 - val_mean_absolute_error: 0.5247 - val_mean_squared_error: 0.5196
Epoch 52/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.5128 - mean_absolute_error: 0.5225 - mean_squared_error: 0.5128 - val_loss: 0.4801 - val_mean_absolute_error: 0.5096 - val_mean_squared_error: 0.4801
Epoch 53/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.4696 - mean_absolute_error: 0.5027 - mean_squared_error: 0.4696 - val_loss: 0.4352 - val_mean_absolute_error: 0.4889 - val_mean_squared_error: 0.4352
Epoch 54/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.4305 - mean_absolute_error: 0.4839 - mean_squared_error: 0.4305 - val_loss: 0.4234 - val_mean_absolute_error: 0.4579 - val_mean_squared_error: 0.4234
Epoch 55/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.4170 - mean_absolute_error: 0.4836 - mean_squared_error: 0.4170 - val_loss: 0.4842 - val_mean_absolute_error: 0.5302 - val_mean_squared_error: 0.4842
Epoch 56/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.3989 - mean_absolute_error: 0.4683 - mean_squared_error: 0.3989 - val_loss: 0.3881 - val_mean_absolute_error: 0.4411 - val_mean_squared_error: 0.3881
Epoch 57/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.3996 - mean_absolute_error: 0.4708 - mean_squared_error: 0.3996 - val_loss: 0.3315 - val_mean_absolute_error: 0.4115 - val_mean_squared_error: 0.3315
Epoch 58/1000
8000/8000 [==============================] - 1s 147us/step - loss: 0.3991 - mean_absolute_error: 0.4666 - mean_squared_error: 0.3991 - val_loss: 0.3597 - val_mean_absolute_error: 0.4233 - val_mean_squared_error: 0.3597
Epoch 59/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.3390 - mean_absolute_error: 0.4199 - mean_squared_error: 0.3390 - val_loss: 0.3588 - val_mean_absolute_error: 0.4462 - val_mean_squared_error: 0.3588
Epoch 60/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.6292 - mean_absolute_error: 0.6019 - mean_squared_error: 0.6292 - val_loss: 0.5285 - val_mean_absolute_error: 0.5504 - val_mean_squared_error: 0.5285
Epoch 61/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.4553 - mean_absolute_error: 0.4945 - mean_squared_error: 0.4553 - val_loss: 0.4324 - val_mean_absolute_error: 0.4831 - val_mean_squared_error: 0.4324
Epoch 62/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.4049 - mean_absolute_error: 0.4631 - mean_squared_error: 0.4049 - val_loss: 0.3672 - val_mean_absolute_error: 0.4309 - val_mean_squared_error: 0.3672
Epoch 63/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.3476 - mean_absolute_error: 0.4270 - mean_squared_error: 0.3476 - val_loss: 0.3247 - val_mean_absolute_error: 0.4026 - val_mean_squared_error: 0.3247
Epoch 64/1000
8000/8000 [==============================] - 1s 149us/step - loss: 0.3352 - mean_absolute_error: 0.4202 - mean_squared_error: 0.3352 - val_loss: 0.3151 - val_mean_absolute_error: 0.3935 - val_mean_squared_error: 0.3151
Epoch 65/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.3551 - mean_absolute_error: 0.4354 - mean_squared_error: 0.3551 - val_loss: 0.3295 - val_mean_absolute_error: 0.4172 - val_mean_squared_error: 0.3295
Epoch 66/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.3332 - mean_absolute_error: 0.4180 - mean_squared_error: 0.3332 - val_loss: 0.3117 - val_mean_absolute_error: 0.3985 - val_mean_squared_error: 0.3117
Epoch 67/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.3215 - mean_absolute_error: 0.4082 - mean_squared_error: 0.3215 - val_loss: 0.3232 - val_mean_absolute_error: 0.4020 - val_mean_squared_error: 0.3232
Epoch 68/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.3110 - mean_absolute_error: 0.4002 - mean_squared_error: 0.3110 - val_loss: 0.3437 - val_mean_absolute_error: 0.4290 - val_mean_squared_error: 0.3437
Epoch 69/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.3638 - mean_absolute_error: 0.4437 - mean_squared_error: 0.3638 - val_loss: 0.3187 - val_mean_absolute_error: 0.3972 - val_mean_squared_error: 0.3187
Epoch 70/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.3327 - mean_absolute_error: 0.4180 - mean_squared_error: 0.3327 - val_loss: 0.3846 - val_mean_absolute_error: 0.4731 - val_mean_squared_error: 0.3846
Epoch 71/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.3581 - mean_absolute_error: 0.4443 - mean_squared_error: 0.3581 - val_loss: 0.3125 - val_mean_absolute_error: 0.3893 - val_mean_squared_error: 0.3125
Epoch 72/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.3151 - mean_absolute_error: 0.4021 - mean_squared_error: 0.3151 - val_loss: 0.2979 - val_mean_absolute_error: 0.3792 - val_mean_squared_error: 0.2979
Epoch 73/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.3398 - mean_absolute_error: 0.4233 - mean_squared_error: 0.3398 - val_loss: 0.3048 - val_mean_absolute_error: 0.3939 - val_mean_squared_error: 0.3048
Epoch 74/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3115 - mean_absolute_error: 0.4008 - mean_squared_error: 0.3115 - val_loss: 0.3211 - val_mean_absolute_error: 0.4080 - val_mean_squared_error: 0.3211
Epoch 75/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.3293 - mean_absolute_error: 0.4194 - mean_squared_error: 0.3293 - val_loss: 0.3267 - val_mean_absolute_error: 0.4035 - val_mean_squared_error: 0.3267
Epoch 76/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.3323 - mean_absolute_error: 0.4224 - mean_squared_error: 0.3323 - val_loss: 0.3053 - val_mean_absolute_error: 0.3857 - val_mean_squared_error: 0.3053
Epoch 77/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.3062 - mean_absolute_error: 0.3959 - mean_squared_error: 0.3062 - val_loss: 0.2926 - val_mean_absolute_error: 0.3750 - val_mean_squared_error: 0.2926
Epoch 78/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.2950 - mean_absolute_error: 0.3876 - mean_squared_error: 0.2950 - val_loss: 0.4439 - val_mean_absolute_error: 0.5167 - val_mean_squared_error: 0.4439
Epoch 79/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.3163 - mean_absolute_error: 0.4069 - mean_squared_error: 0.3163 - val_loss: 0.3019 - val_mean_absolute_error: 0.3900 - val_mean_squared_error: 0.3019
Epoch 80/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3824 - mean_absolute_error: 0.4609 - mean_squared_error: 0.3824 - val_loss: 0.3302 - val_mean_absolute_error: 0.4266 - val_mean_squared_error: 0.3302
Epoch 81/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3340 - mean_absolute_error: 0.4191 - mean_squared_error: 0.3340 - val_loss: 0.3049 - val_mean_absolute_error: 0.3847 - val_mean_squared_error: 0.3049
Epoch 82/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.3034 - mean_absolute_error: 0.3959 - mean_squared_error: 0.3034 - val_loss: 0.3604 - val_mean_absolute_error: 0.4515 - val_mean_squared_error: 0.3604
Epoch 83/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.3372 - mean_absolute_error: 0.4291 - mean_squared_error: 0.3372 - val_loss: 0.4044 - val_mean_absolute_error: 0.4839 - val_mean_squared_error: 0.4044
Epoch 84/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3527 - mean_absolute_error: 0.4418 - mean_squared_error: 0.3527 - val_loss: 0.2912 - val_mean_absolute_error: 0.3739 - val_mean_squared_error: 0.2912
Epoch 85/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.2986 - mean_absolute_error: 0.3926 - mean_squared_error: 0.2986 - val_loss: 0.2953 - val_mean_absolute_error: 0.3835 - val_mean_squared_error: 0.2953
Epoch 86/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2953 - mean_absolute_error: 0.3905 - mean_squared_error: 0.2953 - val_loss: 0.2855 - val_mean_absolute_error: 0.3747 - val_mean_squared_error: 0.2855
Epoch 87/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.3378 - mean_absolute_error: 0.4280 - mean_squared_error: 0.3378 - val_loss: 0.3554 - val_mean_absolute_error: 0.4412 - val_mean_squared_error: 0.3554
Epoch 88/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3234 - mean_absolute_error: 0.4144 - mean_squared_error: 0.3234 - val_loss: 0.3006 - val_mean_absolute_error: 0.3970 - val_mean_squared_error: 0.3006
Epoch 89/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.2979 - mean_absolute_error: 0.3949 - mean_squared_error: 0.2979 - val_loss: 0.3083 - val_mean_absolute_error: 0.4019 - val_mean_squared_error: 0.3083
Epoch 90/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3071 - mean_absolute_error: 0.4018 - mean_squared_error: 0.3071 - val_loss: 0.2775 - val_mean_absolute_error: 0.3712 - val_mean_squared_error: 0.2775
Epoch 91/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.2809 - mean_absolute_error: 0.3788 - mean_squared_error: 0.2809 - val_loss: 0.3340 - val_mean_absolute_error: 0.4276 - val_mean_squared_error: 0.3340
Epoch 92/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.3146 - mean_absolute_error: 0.4085 - mean_squared_error: 0.3146 - val_loss: 0.2822 - val_mean_absolute_error: 0.3698 - val_mean_squared_error: 0.2822
Epoch 93/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2943 - mean_absolute_error: 0.3893 - mean_squared_error: 0.2943 - val_loss: 0.2793 - val_mean_absolute_error: 0.3762 - val_mean_squared_error: 0.2793
Epoch 94/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2751 - mean_absolute_error: 0.3736 - mean_squared_error: 0.2751 - val_loss: 0.3694 - val_mean_absolute_error: 0.4681 - val_mean_squared_error: 0.3694
Epoch 95/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3143 - mean_absolute_error: 0.4118 - mean_squared_error: 0.3143 - val_loss: 0.2723 - val_mean_absolute_error: 0.3570 - val_mean_squared_error: 0.2723
Epoch 96/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.3001 - mean_absolute_error: 0.3962 - mean_squared_error: 0.3001 - val_loss: 0.2993 - val_mean_absolute_error: 0.3957 - val_mean_squared_error: 0.2993
Epoch 97/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3004 - mean_absolute_error: 0.4002 - mean_squared_error: 0.3004 - val_loss: 0.3068 - val_mean_absolute_error: 0.4040 - val_mean_squared_error: 0.3068
Epoch 98/1000
8000/8000 [==============================] - 1s 133us/step - loss: 0.2887 - mean_absolute_error: 0.3856 - mean_squared_error: 0.2887 - val_loss: 0.3012 - val_mean_absolute_error: 0.3988 - val_mean_squared_error: 0.3012
Epoch 99/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2851 - mean_absolute_error: 0.3857 - mean_squared_error: 0.2851 - val_loss: 0.3409 - val_mean_absolute_error: 0.4308 - val_mean_squared_error: 0.3409
Epoch 100/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3137 - mean_absolute_error: 0.4075 - mean_squared_error: 0.3137 - val_loss: 0.2713 - val_mean_absolute_error: 0.3647 - val_mean_squared_error: 0.2713
Epoch 101/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2845 - mean_absolute_error: 0.3845 - mean_squared_error: 0.2845 - val_loss: 0.2892 - val_mean_absolute_error: 0.3973 - val_mean_squared_error: 0.2892
Epoch 102/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2756 - mean_absolute_error: 0.3784 - mean_squared_error: 0.2756 - val_loss: 0.2689 - val_mean_absolute_error: 0.3612 - val_mean_squared_error: 0.2689
Epoch 103/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2565 - mean_absolute_error: 0.3580 - mean_squared_error: 0.2565 - val_loss: 0.2789 - val_mean_absolute_error: 0.3764 - val_mean_squared_error: 0.2789
Epoch 104/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.3100 - mean_absolute_error: 0.4102 - mean_squared_error: 0.3100 - val_loss: 0.3040 - val_mean_absolute_error: 0.4050 - val_mean_squared_error: 0.3040
Epoch 105/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2708 - mean_absolute_error: 0.3718 - mean_squared_error: 0.2708 - val_loss: 0.2782 - val_mean_absolute_error: 0.3723 - val_mean_squared_error: 0.2782
Epoch 106/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2809 - mean_absolute_error: 0.3815 - mean_squared_error: 0.2809 - val_loss: 0.7413 - val_mean_absolute_error: 0.6716 - val_mean_squared_error: 0.7413
Epoch 107/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3687 - mean_absolute_error: 0.4427 - mean_squared_error: 0.3687 - val_loss: 0.3069 - val_mean_absolute_error: 0.4064 - val_mean_squared_error: 0.3069
Epoch 108/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2868 - mean_absolute_error: 0.3824 - mean_squared_error: 0.2868 - val_loss: 0.2555 - val_mean_absolute_error: 0.3489 - val_mean_squared_error: 0.2555
Epoch 109/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.2703 - mean_absolute_error: 0.3732 - mean_squared_error: 0.2703 - val_loss: 0.2956 - val_mean_absolute_error: 0.3951 - val_mean_squared_error: 0.2956
Epoch 110/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2956 - mean_absolute_error: 0.3956 - mean_squared_error: 0.2956 - val_loss: 0.2708 - val_mean_absolute_error: 0.3720 - val_mean_squared_error: 0.2708
Epoch 111/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2677 - mean_absolute_error: 0.3724 - mean_squared_error: 0.2677 - val_loss: 0.2882 - val_mean_absolute_error: 0.4031 - val_mean_squared_error: 0.2882
Epoch 112/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2893 - mean_absolute_error: 0.3932 - mean_squared_error: 0.2893 - val_loss: 0.2649 - val_mean_absolute_error: 0.3533 - val_mean_squared_error: 0.2649
Epoch 113/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2656 - mean_absolute_error: 0.3711 - mean_squared_error: 0.2656 - val_loss: 0.2716 - val_mean_absolute_error: 0.3565 - val_mean_squared_error: 0.2716
Epoch 114/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2827 - mean_absolute_error: 0.3855 - mean_squared_error: 0.2827 - val_loss: 0.3253 - val_mean_absolute_error: 0.4248 - val_mean_squared_error: 0.3253
Epoch 115/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2751 - mean_absolute_error: 0.3759 - mean_squared_error: 0.2751 - val_loss: 0.2442 - val_mean_absolute_error: 0.3454 - val_mean_squared_error: 0.2442
Epoch 116/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.2396 - mean_absolute_error: 0.3450 - mean_squared_error: 0.2396 - val_loss: 0.2568 - val_mean_absolute_error: 0.3567 - val_mean_squared_error: 0.2568
Epoch 117/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.3125 - mean_absolute_error: 0.4120 - mean_squared_error: 0.3125 - val_loss: 0.2769 - val_mean_absolute_error: 0.3833 - val_mean_squared_error: 0.2769
Epoch 118/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2628 - mean_absolute_error: 0.3670 - mean_squared_error: 0.2628 - val_loss: 0.2410 - val_mean_absolute_error: 0.3438 - val_mean_squared_error: 0.2410
Epoch 119/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.2371 - mean_absolute_error: 0.3431 - mean_squared_error: 0.2371 - val_loss: 0.2491 - val_mean_absolute_error: 0.3572 - val_mean_squared_error: 0.2491
Epoch 120/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.3547 - mean_absolute_error: 0.4574 - mean_squared_error: 0.3547 - val_loss: 0.2999 - val_mean_absolute_error: 0.3925 - val_mean_squared_error: 0.2999
Epoch 121/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2895 - mean_absolute_error: 0.3982 - mean_squared_error: 0.2895 - val_loss: 0.2598 - val_mean_absolute_error: 0.3679 - val_mean_squared_error: 0.2598
Epoch 122/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2665 - mean_absolute_error: 0.3709 - mean_squared_error: 0.2665 - val_loss: 0.2432 - val_mean_absolute_error: 0.3405 - val_mean_squared_error: 0.2432
Epoch 123/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2678 - mean_absolute_error: 0.3720 - mean_squared_error: 0.2678 - val_loss: 0.2472 - val_mean_absolute_error: 0.3442 - val_mean_squared_error: 0.2472
Epoch 124/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.3030 - mean_absolute_error: 0.3967 - mean_squared_error: 0.3030 - val_loss: 0.3111 - val_mean_absolute_error: 0.4216 - val_mean_squared_error: 0.3111
Epoch 125/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2715 - mean_absolute_error: 0.3732 - mean_squared_error: 0.2715 - val_loss: 0.2733 - val_mean_absolute_error: 0.3832 - val_mean_squared_error: 0.2733
Epoch 126/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2609 - mean_absolute_error: 0.3693 - mean_squared_error: 0.2609 - val_loss: 0.2785 - val_mean_absolute_error: 0.3625 - val_mean_squared_error: 0.2785
Epoch 127/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2565 - mean_absolute_error: 0.3658 - mean_squared_error: 0.2565 - val_loss: 0.2253 - val_mean_absolute_error: 0.3283 - val_mean_squared_error: 0.2253
Epoch 128/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.2505 - mean_absolute_error: 0.3625 - mean_squared_error: 0.2505 - val_loss: 0.2795 - val_mean_absolute_error: 0.3948 - val_mean_squared_error: 0.2795
Epoch 129/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.2331 - mean_absolute_error: 0.3432 - mean_squared_error: 0.2331 - val_loss: 0.2253 - val_mean_absolute_error: 0.3285 - val_mean_squared_error: 0.2253
Epoch 130/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.3116 - mean_absolute_error: 0.4037 - mean_squared_error: 0.3116 - val_loss: 0.3051 - val_mean_absolute_error: 0.4142 - val_mean_squared_error: 0.3051
Epoch 131/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2887 - mean_absolute_error: 0.3899 - mean_squared_error: 0.2887 - val_loss: 0.2576 - val_mean_absolute_error: 0.3580 - val_mean_squared_error: 0.2576
Epoch 132/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2429 - mean_absolute_error: 0.3491 - mean_squared_error: 0.2429 - val_loss: 0.2331 - val_mean_absolute_error: 0.3370 - val_mean_squared_error: 0.2331
Epoch 133/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2741 - mean_absolute_error: 0.3711 - mean_squared_error: 0.2741 - val_loss: 0.2459 - val_mean_absolute_error: 0.3577 - val_mean_squared_error: 0.2459
Epoch 134/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.3069 - mean_absolute_error: 0.4171 - mean_squared_error: 0.3069 - val_loss: 0.2589 - val_mean_absolute_error: 0.3656 - val_mean_squared_error: 0.2589
Epoch 135/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2634 - mean_absolute_error: 0.3689 - mean_squared_error: 0.2634 - val_loss: 0.2495 - val_mean_absolute_error: 0.3707 - val_mean_squared_error: 0.2495
Epoch 136/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2488 - mean_absolute_error: 0.3645 - mean_squared_error: 0.2488 - val_loss: 0.2440 - val_mean_absolute_error: 0.3349 - val_mean_squared_error: 0.2440
Epoch 137/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2444 - mean_absolute_error: 0.3578 - mean_squared_error: 0.2444 - val_loss: 0.2366 - val_mean_absolute_error: 0.3317 - val_mean_squared_error: 0.2366
Epoch 138/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2393 - mean_absolute_error: 0.3488 - mean_squared_error: 0.2393 - val_loss: 0.2549 - val_mean_absolute_error: 0.3655 - val_mean_squared_error: 0.2549
Epoch 139/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2440 - mean_absolute_error: 0.3569 - mean_squared_error: 0.2440 - val_loss: 0.2253 - val_mean_absolute_error: 0.3303 - val_mean_squared_error: 0.2253
Epoch 140/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2382 - mean_absolute_error: 0.3512 - mean_squared_error: 0.2382 - val_loss: 0.2118 - val_mean_absolute_error: 0.3210 - val_mean_squared_error: 0.2118
Epoch 141/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.2648 - mean_absolute_error: 0.3681 - mean_squared_error: 0.2648 - val_loss: 0.2637 - val_mean_absolute_error: 0.3657 - val_mean_squared_error: 0.2637
Epoch 142/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2295 - mean_absolute_error: 0.3379 - mean_squared_error: 0.2295 - val_loss: 0.2096 - val_mean_absolute_error: 0.3076 - val_mean_squared_error: 0.2096
Epoch 143/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.3076 - mean_absolute_error: 0.4090 - mean_squared_error: 0.3076 - val_loss: 0.2296 - val_mean_absolute_error: 0.3300 - val_mean_squared_error: 0.2296
Epoch 144/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2526 - mean_absolute_error: 0.3625 - mean_squared_error: 0.2526 - val_loss: 0.2156 - val_mean_absolute_error: 0.3160 - val_mean_squared_error: 0.2156
Epoch 145/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2238 - mean_absolute_error: 0.3333 - mean_squared_error: 0.2238 - val_loss: 0.2661 - val_mean_absolute_error: 0.3680 - val_mean_squared_error: 0.2661
Epoch 146/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2373 - mean_absolute_error: 0.3565 - mean_squared_error: 0.2373 - val_loss: 0.2745 - val_mean_absolute_error: 0.3964 - val_mean_squared_error: 0.2745
Epoch 147/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2355 - mean_absolute_error: 0.3550 - mean_squared_error: 0.2355 - val_loss: 0.3171 - val_mean_absolute_error: 0.4006 - val_mean_squared_error: 0.3171
Epoch 148/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.2628 - mean_absolute_error: 0.3601 - mean_squared_error: 0.2628 - val_loss: 0.2133 - val_mean_absolute_error: 0.3300 - val_mean_squared_error: 0.2133
Epoch 149/1000
8000/8000 [==============================] - 1s 132us/step - loss: 0.2232 - mean_absolute_error: 0.3386 - mean_squared_error: 0.2232 - val_loss: 0.2004 - val_mean_absolute_error: 0.3108 - val_mean_squared_error: 0.2004
Epoch 150/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.2453 - mean_absolute_error: 0.3554 - mean_squared_error: 0.2453 - val_loss: 0.2072 - val_mean_absolute_error: 0.3124 - val_mean_squared_error: 0.2072
Epoch 151/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2083 - mean_absolute_error: 0.3171 - mean_squared_error: 0.2083 - val_loss: 0.2114 - val_mean_absolute_error: 0.3317 - val_mean_squared_error: 0.2114
Epoch 152/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2349 - mean_absolute_error: 0.3489 - mean_squared_error: 0.2349 - val_loss: 0.2989 - val_mean_absolute_error: 0.4038 - val_mean_squared_error: 0.2989
Epoch 153/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2164 - mean_absolute_error: 0.3288 - mean_squared_error: 0.2164 - val_loss: 0.2155 - val_mean_absolute_error: 0.3248 - val_mean_squared_error: 0.2155
Epoch 154/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2415 - mean_absolute_error: 0.3533 - mean_squared_error: 0.2415 - val_loss: 0.2882 - val_mean_absolute_error: 0.3630 - val_mean_squared_error: 0.2882
Epoch 155/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2226 - mean_absolute_error: 0.3373 - mean_squared_error: 0.2226 - val_loss: 0.2123 - val_mean_absolute_error: 0.3191 - val_mean_squared_error: 0.2123
Epoch 156/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2357 - mean_absolute_error: 0.3430 - mean_squared_error: 0.2357 - val_loss: 0.1978 - val_mean_absolute_error: 0.3141 - val_mean_squared_error: 0.1978
Epoch 157/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2481 - mean_absolute_error: 0.3553 - mean_squared_error: 0.2481 - val_loss: 0.2281 - val_mean_absolute_error: 0.3415 - val_mean_squared_error: 0.2281
Epoch 158/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2938 - mean_absolute_error: 0.3974 - mean_squared_error: 0.2938 - val_loss: 0.1951 - val_mean_absolute_error: 0.2965 - val_mean_squared_error: 0.1951
Epoch 159/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2919 - mean_absolute_error: 0.3994 - mean_squared_error: 0.2919 - val_loss: 0.2799 - val_mean_absolute_error: 0.3901 - val_mean_squared_error: 0.2799
Epoch 160/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.2350 - mean_absolute_error: 0.3492 - mean_squared_error: 0.2350 - val_loss: 0.2190 - val_mean_absolute_error: 0.3331 - val_mean_squared_error: 0.2190
Epoch 161/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2574 - mean_absolute_error: 0.3550 - mean_squared_error: 0.2574 - val_loss: 0.2961 - val_mean_absolute_error: 0.3874 - val_mean_squared_error: 0.2961
Epoch 162/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2261 - mean_absolute_error: 0.3296 - mean_squared_error: 0.2261 - val_loss: 0.2359 - val_mean_absolute_error: 0.3405 - val_mean_squared_error: 0.2359
Epoch 163/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1971 - mean_absolute_error: 0.3115 - mean_squared_error: 0.1971 - val_loss: 0.1940 - val_mean_absolute_error: 0.3017 - val_mean_squared_error: 0.1940
Epoch 164/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.2184 - mean_absolute_error: 0.3384 - mean_squared_error: 0.2184 - val_loss: 0.2273 - val_mean_absolute_error: 0.3576 - val_mean_squared_error: 0.2273
Epoch 165/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1903 - mean_absolute_error: 0.3077 - mean_squared_error: 0.1903 - val_loss: 0.2258 - val_mean_absolute_error: 0.3259 - val_mean_squared_error: 0.2258
Epoch 166/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2272 - mean_absolute_error: 0.3360 - mean_squared_error: 0.2272 - val_loss: 0.1913 - val_mean_absolute_error: 0.3043 - val_mean_squared_error: 0.1913
Epoch 167/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.1941 - mean_absolute_error: 0.3139 - mean_squared_error: 0.1941 - val_loss: 0.1947 - val_mean_absolute_error: 0.3191 - val_mean_squared_error: 0.1947
Epoch 168/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.4272 - mean_absolute_error: 0.4476 - mean_squared_error: 0.4272 - val_loss: 0.3204 - val_mean_absolute_error: 0.3810 - val_mean_squared_error: 0.3204
Epoch 169/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2536 - mean_absolute_error: 0.3554 - mean_squared_error: 0.2536 - val_loss: 0.2085 - val_mean_absolute_error: 0.3079 - val_mean_squared_error: 0.2085
Epoch 170/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2175 - mean_absolute_error: 0.3283 - mean_squared_error: 0.2175 - val_loss: 0.6191 - val_mean_absolute_error: 0.6535 - val_mean_squared_error: 0.6191
Epoch 171/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.5842 - mean_absolute_error: 0.5782 - mean_squared_error: 0.5842 - val_loss: 0.4110 - val_mean_absolute_error: 0.4916 - val_mean_squared_error: 0.4110
Epoch 172/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3433 - mean_absolute_error: 0.4236 - mean_squared_error: 0.3433 - val_loss: 0.2943 - val_mean_absolute_error: 0.3930 - val_mean_squared_error: 0.2943
Epoch 173/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2444 - mean_absolute_error: 0.3531 - mean_squared_error: 0.2444 - val_loss: 0.2079 - val_mean_absolute_error: 0.3177 - val_mean_squared_error: 0.2079
Epoch 174/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1960 - mean_absolute_error: 0.3128 - mean_squared_error: 0.1960 - val_loss: 0.2053 - val_mean_absolute_error: 0.3209 - val_mean_squared_error: 0.2053
Epoch 175/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2306 - mean_absolute_error: 0.3527 - mean_squared_error: 0.2306 - val_loss: 0.2620 - val_mean_absolute_error: 0.3822 - val_mean_squared_error: 0.2620
Epoch 176/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2060 - mean_absolute_error: 0.3210 - mean_squared_error: 0.2060 - val_loss: 0.2117 - val_mean_absolute_error: 0.3112 - val_mean_squared_error: 0.2117
Epoch 177/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2059 - mean_absolute_error: 0.3258 - mean_squared_error: 0.2059 - val_loss: 0.2284 - val_mean_absolute_error: 0.3431 - val_mean_squared_error: 0.2284
Epoch 178/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2015 - mean_absolute_error: 0.3166 - mean_squared_error: 0.2015 - val_loss: 0.1800 - val_mean_absolute_error: 0.2845 - val_mean_squared_error: 0.1800
Epoch 179/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.1826 - mean_absolute_error: 0.2978 - mean_squared_error: 0.1826 - val_loss: 0.1755 - val_mean_absolute_error: 0.2825 - val_mean_squared_error: 0.1755
Epoch 180/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.2087 - mean_absolute_error: 0.3217 - mean_squared_error: 0.2087 - val_loss: 0.2524 - val_mean_absolute_error: 0.3509 - val_mean_squared_error: 0.2524
Epoch 181/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2018 - mean_absolute_error: 0.3153 - mean_squared_error: 0.2018 - val_loss: 0.2025 - val_mean_absolute_error: 0.3163 - val_mean_squared_error: 0.2025
Epoch 182/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1889 - mean_absolute_error: 0.3066 - mean_squared_error: 0.1889 - val_loss: 0.1825 - val_mean_absolute_error: 0.3065 - val_mean_squared_error: 0.1825
Epoch 183/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.2173 - mean_absolute_error: 0.3245 - mean_squared_error: 0.2173 - val_loss: 0.2667 - val_mean_absolute_error: 0.3673 - val_mean_squared_error: 0.2667
Epoch 184/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2314 - mean_absolute_error: 0.3409 - mean_squared_error: 0.2314 - val_loss: 0.2079 - val_mean_absolute_error: 0.2971 - val_mean_squared_error: 0.2079
Epoch 185/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1810 - mean_absolute_error: 0.2964 - mean_squared_error: 0.1810 - val_loss: 0.1707 - val_mean_absolute_error: 0.2772 - val_mean_squared_error: 0.1707
Epoch 186/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.1776 - mean_absolute_error: 0.2918 - mean_squared_error: 0.1776 - val_loss: 0.2588 - val_mean_absolute_error: 0.3615 - val_mean_squared_error: 0.2588
Epoch 187/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2975 - mean_absolute_error: 0.4077 - mean_squared_error: 0.2975 - val_loss: 0.2573 - val_mean_absolute_error: 0.3554 - val_mean_squared_error: 0.2573
Epoch 188/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1963 - mean_absolute_error: 0.3109 - mean_squared_error: 0.1963 - val_loss: 0.1734 - val_mean_absolute_error: 0.2878 - val_mean_squared_error: 0.1734
Epoch 189/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1706 - mean_absolute_error: 0.2902 - mean_squared_error: 0.1706 - val_loss: 0.1805 - val_mean_absolute_error: 0.2894 - val_mean_squared_error: 0.1805
Epoch 190/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.3192 - mean_absolute_error: 0.4145 - mean_squared_error: 0.3192 - val_loss: 0.2352 - val_mean_absolute_error: 0.3637 - val_mean_squared_error: 0.2352
Epoch 191/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2119 - mean_absolute_error: 0.3261 - mean_squared_error: 0.2119 - val_loss: 0.1771 - val_mean_absolute_error: 0.2863 - val_mean_squared_error: 0.1771
Epoch 192/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1851 - mean_absolute_error: 0.3011 - mean_squared_error: 0.1851 - val_loss: 0.1937 - val_mean_absolute_error: 0.3107 - val_mean_squared_error: 0.1937
Epoch 193/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1880 - mean_absolute_error: 0.3034 - mean_squared_error: 0.1880 - val_loss: 0.1746 - val_mean_absolute_error: 0.2989 - val_mean_squared_error: 0.1746
Epoch 194/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1789 - mean_absolute_error: 0.3029 - mean_squared_error: 0.1789 - val_loss: 0.2059 - val_mean_absolute_error: 0.3072 - val_mean_squared_error: 0.2059
Epoch 195/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1881 - mean_absolute_error: 0.3045 - mean_squared_error: 0.1881 - val_loss: 0.2026 - val_mean_absolute_error: 0.3138 - val_mean_squared_error: 0.2026
Epoch 196/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2805 - mean_absolute_error: 0.3856 - mean_squared_error: 0.2805 - val_loss: 0.2438 - val_mean_absolute_error: 0.3491 - val_mean_squared_error: 0.2438
Epoch 197/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1909 - mean_absolute_error: 0.3045 - mean_squared_error: 0.1909 - val_loss: 0.1674 - val_mean_absolute_error: 0.2837 - val_mean_squared_error: 0.1674
Epoch 198/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.1678 - mean_absolute_error: 0.2892 - mean_squared_error: 0.1678 - val_loss: 0.1996 - val_mean_absolute_error: 0.3095 - val_mean_squared_error: 0.1996
Epoch 199/1000
8000/8000 [==============================] - 1s 133us/step - loss: 0.2299 - mean_absolute_error: 0.3460 - mean_squared_error: 0.2299 - val_loss: 0.1975 - val_mean_absolute_error: 0.3059 - val_mean_squared_error: 0.1975
Epoch 200/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1808 - mean_absolute_error: 0.2968 - mean_squared_error: 0.1808 - val_loss: 0.1609 - val_mean_absolute_error: 0.2696 - val_mean_squared_error: 0.1609
Epoch 201/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.1911 - mean_absolute_error: 0.3017 - mean_squared_error: 0.1911 - val_loss: 0.1858 - val_mean_absolute_error: 0.2957 - val_mean_squared_error: 0.1858
Epoch 202/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2190 - mean_absolute_error: 0.3414 - mean_squared_error: 0.2190 - val_loss: 0.1737 - val_mean_absolute_error: 0.2785 - val_mean_squared_error: 0.1737
Epoch 203/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1914 - mean_absolute_error: 0.3057 - mean_squared_error: 0.1914 - val_loss: 0.2158 - val_mean_absolute_error: 0.3343 - val_mean_squared_error: 0.2158
Epoch 204/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1756 - mean_absolute_error: 0.3014 - mean_squared_error: 0.1756 - val_loss: 0.1637 - val_mean_absolute_error: 0.2752 - val_mean_squared_error: 0.1637
Epoch 205/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2433 - mean_absolute_error: 0.3489 - mean_squared_error: 0.2433 - val_loss: 0.1979 - val_mean_absolute_error: 0.3224 - val_mean_squared_error: 0.1979
Epoch 206/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1697 - mean_absolute_error: 0.2835 - mean_squared_error: 0.1697 - val_loss: 0.1799 - val_mean_absolute_error: 0.2960 - val_mean_squared_error: 0.1799
Epoch 207/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1584 - mean_absolute_error: 0.2779 - mean_squared_error: 0.1584 - val_loss: 0.1617 - val_mean_absolute_error: 0.2727 - val_mean_squared_error: 0.1617
Epoch 208/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.5094 - mean_absolute_error: 0.4606 - mean_squared_error: 0.5094 - val_loss: 0.4923 - val_mean_absolute_error: 0.4758 - val_mean_squared_error: 0.4923
Epoch 209/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.3753 - mean_absolute_error: 0.4271 - mean_squared_error: 0.3753 - val_loss: 0.3191 - val_mean_absolute_error: 0.3530 - val_mean_squared_error: 0.3191
Epoch 210/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2823 - mean_absolute_error: 0.3586 - mean_squared_error: 0.2823 - val_loss: 0.2453 - val_mean_absolute_error: 0.3359 - val_mean_squared_error: 0.2453
Epoch 211/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2447 - mean_absolute_error: 0.3564 - mean_squared_error: 0.2447 - val_loss: 0.3873 - val_mean_absolute_error: 0.4808 - val_mean_squared_error: 0.3873
Epoch 212/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.2497 - mean_absolute_error: 0.3668 - mean_squared_error: 0.2497 - val_loss: 0.2810 - val_mean_absolute_error: 0.3987 - val_mean_squared_error: 0.2810
Epoch 213/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2199 - mean_absolute_error: 0.3426 - mean_squared_error: 0.2199 - val_loss: 0.1940 - val_mean_absolute_error: 0.3089 - val_mean_squared_error: 0.1940
Epoch 214/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1949 - mean_absolute_error: 0.3134 - mean_squared_error: 0.1949 - val_loss: 0.1883 - val_mean_absolute_error: 0.2961 - val_mean_squared_error: 0.1883
Epoch 215/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2062 - mean_absolute_error: 0.3321 - mean_squared_error: 0.2062 - val_loss: 0.1856 - val_mean_absolute_error: 0.3023 - val_mean_squared_error: 0.1856
Epoch 216/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1829 - mean_absolute_error: 0.3026 - mean_squared_error: 0.1829 - val_loss: 0.1853 - val_mean_absolute_error: 0.2934 - val_mean_squared_error: 0.1853
Epoch 217/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1763 - mean_absolute_error: 0.2981 - mean_squared_error: 0.1763 - val_loss: 0.1706 - val_mean_absolute_error: 0.2808 - val_mean_squared_error: 0.1706
Epoch 218/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2231 - mean_absolute_error: 0.3419 - mean_squared_error: 0.2231 - val_loss: 0.2308 - val_mean_absolute_error: 0.3705 - val_mean_squared_error: 0.2308
Epoch 219/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2071 - mean_absolute_error: 0.3324 - mean_squared_error: 0.2071 - val_loss: 0.2037 - val_mean_absolute_error: 0.3324 - val_mean_squared_error: 0.2037
Epoch 220/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1781 - mean_absolute_error: 0.3009 - mean_squared_error: 0.1781 - val_loss: 0.1729 - val_mean_absolute_error: 0.2810 - val_mean_squared_error: 0.1729
Epoch 221/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1697 - mean_absolute_error: 0.2935 - mean_squared_error: 0.1697 - val_loss: 0.1861 - val_mean_absolute_error: 0.3046 - val_mean_squared_error: 0.1861
Epoch 222/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1845 - mean_absolute_error: 0.3098 - mean_squared_error: 0.1845 - val_loss: 0.1823 - val_mean_absolute_error: 0.3114 - val_mean_squared_error: 0.1823
Epoch 223/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1558 - mean_absolute_error: 0.2769 - mean_squared_error: 0.1558 - val_loss: 0.2085 - val_mean_absolute_error: 0.3146 - val_mean_squared_error: 0.2085
Epoch 224/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1961 - mean_absolute_error: 0.3141 - mean_squared_error: 0.1961 - val_loss: 0.2161 - val_mean_absolute_error: 0.3270 - val_mean_squared_error: 0.2161
Epoch 225/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1789 - mean_absolute_error: 0.2932 - mean_squared_error: 0.1789 - val_loss: 0.1505 - val_mean_absolute_error: 0.2659 - val_mean_squared_error: 0.1505
Epoch 226/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.1542 - mean_absolute_error: 0.2729 - mean_squared_error: 0.1542 - val_loss: 0.1905 - val_mean_absolute_error: 0.3036 - val_mean_squared_error: 0.1905
Epoch 227/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2015 - mean_absolute_error: 0.3188 - mean_squared_error: 0.2015 - val_loss: 0.1704 - val_mean_absolute_error: 0.2858 - val_mean_squared_error: 0.1704
Epoch 228/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2211 - mean_absolute_error: 0.3286 - mean_squared_error: 0.2211 - val_loss: 0.2762 - val_mean_absolute_error: 0.3606 - val_mean_squared_error: 0.2762
Epoch 229/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1873 - mean_absolute_error: 0.3060 - mean_squared_error: 0.1873 - val_loss: 0.1720 - val_mean_absolute_error: 0.2929 - val_mean_squared_error: 0.1720
Epoch 230/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1640 - mean_absolute_error: 0.2889 - mean_squared_error: 0.1640 - val_loss: 0.1954 - val_mean_absolute_error: 0.3435 - val_mean_squared_error: 0.1954
Epoch 231/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2009 - mean_absolute_error: 0.3334 - mean_squared_error: 0.2009 - val_loss: 0.1549 - val_mean_absolute_error: 0.2601 - val_mean_squared_error: 0.1549
Epoch 232/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1630 - mean_absolute_error: 0.2854 - mean_squared_error: 0.1630 - val_loss: 0.1695 - val_mean_absolute_error: 0.2947 - val_mean_squared_error: 0.1695
Epoch 233/1000
8000/8000 [==============================] - 1s 134us/step - loss: 0.1913 - mean_absolute_error: 0.3136 - mean_squared_error: 0.1913 - val_loss: 0.1716 - val_mean_absolute_error: 0.2866 - val_mean_squared_error: 0.1716
Epoch 234/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1915 - mean_absolute_error: 0.3121 - mean_squared_error: 0.1915 - val_loss: 0.1480 - val_mean_absolute_error: 0.2610 - val_mean_squared_error: 0.1480
Epoch 235/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1445 - mean_absolute_error: 0.2643 - mean_squared_error: 0.1445 - val_loss: 0.2440 - val_mean_absolute_error: 0.3817 - val_mean_squared_error: 0.2440
Epoch 236/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1948 - mean_absolute_error: 0.3124 - mean_squared_error: 0.1948 - val_loss: 0.1444 - val_mean_absolute_error: 0.2541 - val_mean_squared_error: 0.1444
Epoch 237/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1490 - mean_absolute_error: 0.2655 - mean_squared_error: 0.1490 - val_loss: 0.1694 - val_mean_absolute_error: 0.2896 - val_mean_squared_error: 0.1694
Epoch 238/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1560 - mean_absolute_error: 0.2730 - mean_squared_error: 0.1560 - val_loss: 0.1490 - val_mean_absolute_error: 0.2692 - val_mean_squared_error: 0.1490
Epoch 239/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.3640 - mean_absolute_error: 0.4120 - mean_squared_error: 0.3640 - val_loss: 0.3173 - val_mean_absolute_error: 0.4599 - val_mean_squared_error: 0.3173
Epoch 240/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2477 - mean_absolute_error: 0.3518 - mean_squared_error: 0.2477 - val_loss: 0.2031 - val_mean_absolute_error: 0.3042 - val_mean_squared_error: 0.2031
Epoch 241/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1751 - mean_absolute_error: 0.2941 - mean_squared_error: 0.1751 - val_loss: 0.1664 - val_mean_absolute_error: 0.2897 - val_mean_squared_error: 0.1664
Epoch 242/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1686 - mean_absolute_error: 0.2987 - mean_squared_error: 0.1686 - val_loss: 0.1516 - val_mean_absolute_error: 0.2650 - val_mean_squared_error: 0.1516
Epoch 243/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1534 - mean_absolute_error: 0.2789 - mean_squared_error: 0.1534 - val_loss: 0.1661 - val_mean_absolute_error: 0.2885 - val_mean_squared_error: 0.1661
Epoch 244/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1522 - mean_absolute_error: 0.2775 - mean_squared_error: 0.1522 - val_loss: 0.1700 - val_mean_absolute_error: 0.3075 - val_mean_squared_error: 0.1700
Epoch 245/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1505 - mean_absolute_error: 0.2746 - mean_squared_error: 0.1505 - val_loss: 0.2826 - val_mean_absolute_error: 0.4359 - val_mean_squared_error: 0.2826
Epoch 246/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1683 - mean_absolute_error: 0.2953 - mean_squared_error: 0.1683 - val_loss: 0.1693 - val_mean_absolute_error: 0.2822 - val_mean_squared_error: 0.1693
Epoch 247/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1951 - mean_absolute_error: 0.3151 - mean_squared_error: 0.1951 - val_loss: 0.1710 - val_mean_absolute_error: 0.2977 - val_mean_squared_error: 0.1710
Epoch 248/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1669 - mean_absolute_error: 0.2855 - mean_squared_error: 0.1669 - val_loss: 0.1655 - val_mean_absolute_error: 0.2963 - val_mean_squared_error: 0.1655
Epoch 249/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1887 - mean_absolute_error: 0.3063 - mean_squared_error: 0.1887 - val_loss: 0.1592 - val_mean_absolute_error: 0.2644 - val_mean_squared_error: 0.1592
Epoch 250/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1862 - mean_absolute_error: 0.3005 - mean_squared_error: 0.1862 - val_loss: 0.1660 - val_mean_absolute_error: 0.2658 - val_mean_squared_error: 0.1660
Epoch 251/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1572 - mean_absolute_error: 0.2780 - mean_squared_error: 0.1572 - val_loss: 0.1501 - val_mean_absolute_error: 0.2813 - val_mean_squared_error: 0.1501
Epoch 252/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1584 - mean_absolute_error: 0.2819 - mean_squared_error: 0.1584 - val_loss: 0.1707 - val_mean_absolute_error: 0.3052 - val_mean_squared_error: 0.1707
Epoch 253/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1642 - mean_absolute_error: 0.2872 - mean_squared_error: 0.1642 - val_loss: 0.1544 - val_mean_absolute_error: 0.2618 - val_mean_squared_error: 0.1544
Epoch 254/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1740 - mean_absolute_error: 0.2860 - mean_squared_error: 0.1740 - val_loss: 0.1465 - val_mean_absolute_error: 0.2559 - val_mean_squared_error: 0.1465
Epoch 255/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1722 - mean_absolute_error: 0.2863 - mean_squared_error: 0.1722 - val_loss: 0.1377 - val_mean_absolute_error: 0.2525 - val_mean_squared_error: 0.1377
Epoch 256/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.1795 - mean_absolute_error: 0.2962 - mean_squared_error: 0.1795 - val_loss: 0.8134 - val_mean_absolute_error: 0.7807 - val_mean_squared_error: 0.8134
Epoch 257/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.3430 - mean_absolute_error: 0.4412 - mean_squared_error: 0.3430 - val_loss: 0.2764 - val_mean_absolute_error: 0.3858 - val_mean_squared_error: 0.2764
Epoch 258/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2006 - mean_absolute_error: 0.3183 - mean_squared_error: 0.2006 - val_loss: 0.1548 - val_mean_absolute_error: 0.2722 - val_mean_squared_error: 0.1548
Epoch 259/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1633 - mean_absolute_error: 0.2807 - mean_squared_error: 0.1633 - val_loss: 0.1759 - val_mean_absolute_error: 0.2912 - val_mean_squared_error: 0.1759
Epoch 260/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1584 - mean_absolute_error: 0.2910 - mean_squared_error: 0.1584 - val_loss: 0.1360 - val_mean_absolute_error: 0.2500 - val_mean_squared_error: 0.1360
Epoch 261/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.1571 - mean_absolute_error: 0.2742 - mean_squared_error: 0.1571 - val_loss: 0.1335 - val_mean_absolute_error: 0.2434 - val_mean_squared_error: 0.1335
Epoch 262/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.1519 - mean_absolute_error: 0.2692 - mean_squared_error: 0.1519 - val_loss: 0.1544 - val_mean_absolute_error: 0.2718 - val_mean_squared_error: 0.1544
Epoch 263/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1600 - mean_absolute_error: 0.2793 - mean_squared_error: 0.1600 - val_loss: 0.1395 - val_mean_absolute_error: 0.2605 - val_mean_squared_error: 0.1395
Epoch 264/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1492 - mean_absolute_error: 0.2714 - mean_squared_error: 0.1492 - val_loss: 0.2269 - val_mean_absolute_error: 0.3304 - val_mean_squared_error: 0.2269
Epoch 265/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1591 - mean_absolute_error: 0.2733 - mean_squared_error: 0.1591 - val_loss: 0.1935 - val_mean_absolute_error: 0.2909 - val_mean_squared_error: 0.1935
Epoch 266/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1574 - mean_absolute_error: 0.2723 - mean_squared_error: 0.1574 - val_loss: 0.1374 - val_mean_absolute_error: 0.2483 - val_mean_squared_error: 0.1374
Epoch 267/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1699 - mean_absolute_error: 0.2842 - mean_squared_error: 0.1699 - val_loss: 0.1573 - val_mean_absolute_error: 0.2733 - val_mean_squared_error: 0.1573
Epoch 268/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1446 - mean_absolute_error: 0.2621 - mean_squared_error: 0.1446 - val_loss: 0.2170 - val_mean_absolute_error: 0.3243 - val_mean_squared_error: 0.2170
Epoch 269/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1788 - mean_absolute_error: 0.2904 - mean_squared_error: 0.1788 - val_loss: 0.1349 - val_mean_absolute_error: 0.2543 - val_mean_squared_error: 0.1349
Epoch 270/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1407 - mean_absolute_error: 0.2666 - mean_squared_error: 0.1407 - val_loss: 0.1315 - val_mean_absolute_error: 0.2469 - val_mean_squared_error: 0.1315
Epoch 271/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.1554 - mean_absolute_error: 0.2699 - mean_squared_error: 0.1554 - val_loss: 0.1275 - val_mean_absolute_error: 0.2383 - val_mean_squared_error: 0.1275
Epoch 272/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1753 - mean_absolute_error: 0.3041 - mean_squared_error: 0.1753 - val_loss: 0.3497 - val_mean_absolute_error: 0.4519 - val_mean_squared_error: 0.3497
Epoch 273/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3683 - mean_absolute_error: 0.4528 - mean_squared_error: 0.3683 - val_loss: 0.2858 - val_mean_absolute_error: 0.4438 - val_mean_squared_error: 0.2858
Epoch 274/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1963 - mean_absolute_error: 0.3348 - mean_squared_error: 0.1963 - val_loss: 0.1849 - val_mean_absolute_error: 0.3095 - val_mean_squared_error: 0.1849
Epoch 275/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.1747 - mean_absolute_error: 0.3044 - mean_squared_error: 0.1747 - val_loss: 0.2954 - val_mean_absolute_error: 0.4298 - val_mean_squared_error: 0.2954
Epoch 276/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1762 - mean_absolute_error: 0.3031 - mean_squared_error: 0.1762 - val_loss: 0.1460 - val_mean_absolute_error: 0.2615 - val_mean_squared_error: 0.1460
Epoch 277/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1431 - mean_absolute_error: 0.2685 - mean_squared_error: 0.1431 - val_loss: 0.1764 - val_mean_absolute_error: 0.3048 - val_mean_squared_error: 0.1764
Epoch 278/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.2200 - mean_absolute_error: 0.3371 - mean_squared_error: 0.2200 - val_loss: 0.2150 - val_mean_absolute_error: 0.3312 - val_mean_squared_error: 0.2150
Epoch 279/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1498 - mean_absolute_error: 0.2672 - mean_squared_error: 0.1498 - val_loss: 0.1419 - val_mean_absolute_error: 0.2596 - val_mean_squared_error: 0.1419
Epoch 280/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1309 - mean_absolute_error: 0.2531 - mean_squared_error: 0.1309 - val_loss: 0.2899 - val_mean_absolute_error: 0.4126 - val_mean_squared_error: 0.2899
Epoch 281/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2554 - mean_absolute_error: 0.3701 - mean_squared_error: 0.2554 - val_loss: 0.1840 - val_mean_absolute_error: 0.2912 - val_mean_squared_error: 0.1840
Epoch 282/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1450 - mean_absolute_error: 0.2631 - mean_squared_error: 0.1450 - val_loss: 0.1442 - val_mean_absolute_error: 0.2621 - val_mean_squared_error: 0.1442
Epoch 283/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1328 - mean_absolute_error: 0.2539 - mean_squared_error: 0.1328 - val_loss: 0.2212 - val_mean_absolute_error: 0.3315 - val_mean_squared_error: 0.2212
Epoch 284/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2168 - mean_absolute_error: 0.3351 - mean_squared_error: 0.2168 - val_loss: 0.1883 - val_mean_absolute_error: 0.3330 - val_mean_squared_error: 0.1883
Epoch 285/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1529 - mean_absolute_error: 0.2758 - mean_squared_error: 0.1529 - val_loss: 0.1312 - val_mean_absolute_error: 0.2458 - val_mean_squared_error: 0.1312
Epoch 286/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1335 - mean_absolute_error: 0.2514 - mean_squared_error: 0.1335 - val_loss: 0.1382 - val_mean_absolute_error: 0.2534 - val_mean_squared_error: 0.1382
Epoch 287/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1338 - mean_absolute_error: 0.2565 - mean_squared_error: 0.1338 - val_loss: 0.1384 - val_mean_absolute_error: 0.2512 - val_mean_squared_error: 0.1384
Epoch 288/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1393 - mean_absolute_error: 0.2637 - mean_squared_error: 0.1393 - val_loss: 0.1346 - val_mean_absolute_error: 0.2684 - val_mean_squared_error: 0.1346
Epoch 289/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1930 - mean_absolute_error: 0.2884 - mean_squared_error: 0.1930 - val_loss: 1.3320 - val_mean_absolute_error: 0.7969 - val_mean_squared_error: 1.3320
Epoch 290/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3977 - mean_absolute_error: 0.4107 - mean_squared_error: 0.3977 - val_loss: 0.2122 - val_mean_absolute_error: 0.2806 - val_mean_squared_error: 0.2122
Epoch 291/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1969 - mean_absolute_error: 0.2908 - mean_squared_error: 0.1969 - val_loss: 0.1604 - val_mean_absolute_error: 0.2616 - val_mean_squared_error: 0.1604
Epoch 292/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1595 - mean_absolute_error: 0.2800 - mean_squared_error: 0.1595 - val_loss: 0.1457 - val_mean_absolute_error: 0.2576 - val_mean_squared_error: 0.1457
Epoch 293/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1388 - mean_absolute_error: 0.2594 - mean_squared_error: 0.1388 - val_loss: 0.1401 - val_mean_absolute_error: 0.2553 - val_mean_squared_error: 0.1401
Epoch 294/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1678 - mean_absolute_error: 0.3034 - mean_squared_error: 0.1678 - val_loss: 0.1349 - val_mean_absolute_error: 0.2469 - val_mean_squared_error: 0.1349
Epoch 295/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1311 - mean_absolute_error: 0.2511 - mean_squared_error: 0.1311 - val_loss: 0.1422 - val_mean_absolute_error: 0.2700 - val_mean_squared_error: 0.1422
Epoch 296/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1361 - mean_absolute_error: 0.2601 - mean_squared_error: 0.1361 - val_loss: 0.1484 - val_mean_absolute_error: 0.2683 - val_mean_squared_error: 0.1484
Epoch 297/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1749 - mean_absolute_error: 0.3142 - mean_squared_error: 0.1749 - val_loss: 0.1760 - val_mean_absolute_error: 0.3106 - val_mean_squared_error: 0.1760
Epoch 298/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1341 - mean_absolute_error: 0.2612 - mean_squared_error: 0.1341 - val_loss: 0.1947 - val_mean_absolute_error: 0.3209 - val_mean_squared_error: 0.1947
Epoch 299/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1414 - mean_absolute_error: 0.2663 - mean_squared_error: 0.1414 - val_loss: 0.1221 - val_mean_absolute_error: 0.2320 - val_mean_squared_error: 0.1221
Epoch 300/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.1470 - mean_absolute_error: 0.2712 - mean_squared_error: 0.1470 - val_loss: 0.2140 - val_mean_absolute_error: 0.3443 - val_mean_squared_error: 0.2140
Epoch 301/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1411 - mean_absolute_error: 0.2656 - mean_squared_error: 0.1411 - val_loss: 0.1284 - val_mean_absolute_error: 0.2497 - val_mean_squared_error: 0.1284
Epoch 302/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1524 - mean_absolute_error: 0.2724 - mean_squared_error: 0.1524 - val_loss: 0.1296 - val_mean_absolute_error: 0.2580 - val_mean_squared_error: 0.1296
Epoch 303/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1338 - mean_absolute_error: 0.2542 - mean_squared_error: 0.1338 - val_loss: 0.1805 - val_mean_absolute_error: 0.3021 - val_mean_squared_error: 0.1805
Epoch 304/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1420 - mean_absolute_error: 0.2661 - mean_squared_error: 0.1420 - val_loss: 0.1285 - val_mean_absolute_error: 0.2364 - val_mean_squared_error: 0.1285
Epoch 305/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1283 - mean_absolute_error: 0.2456 - mean_squared_error: 0.1283 - val_loss: 0.1140 - val_mean_absolute_error: 0.2287 - val_mean_squared_error: 0.1140
Epoch 306/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.2296 - mean_absolute_error: 0.3113 - mean_squared_error: 0.2296 - val_loss: 1.3825 - val_mean_absolute_error: 0.8653 - val_mean_squared_error: 1.3825
Epoch 307/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.4120 - mean_absolute_error: 0.4402 - mean_squared_error: 0.4120 - val_loss: 0.2745 - val_mean_absolute_error: 0.3550 - val_mean_squared_error: 0.2745
Epoch 308/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.2145 - mean_absolute_error: 0.3149 - mean_squared_error: 0.2145 - val_loss: 0.1691 - val_mean_absolute_error: 0.2770 - val_mean_squared_error: 0.1691
Epoch 309/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1465 - mean_absolute_error: 0.2650 - mean_squared_error: 0.1465 - val_loss: 0.1357 - val_mean_absolute_error: 0.2551 - val_mean_squared_error: 0.1357
Epoch 310/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1398 - mean_absolute_error: 0.2694 - mean_squared_error: 0.1398 - val_loss: 0.1408 - val_mean_absolute_error: 0.2700 - val_mean_squared_error: 0.1408
Epoch 311/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1290 - mean_absolute_error: 0.2546 - mean_squared_error: 0.1290 - val_loss: 0.1242 - val_mean_absolute_error: 0.2321 - val_mean_squared_error: 0.1242
Epoch 312/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1231 - mean_absolute_error: 0.2457 - mean_squared_error: 0.1231 - val_loss: 0.1299 - val_mean_absolute_error: 0.2534 - val_mean_squared_error: 0.1299
Epoch 313/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1145 - mean_absolute_error: 0.2338 - mean_squared_error: 0.1145 - val_loss: 0.1575 - val_mean_absolute_error: 0.2790 - val_mean_squared_error: 0.1575
Epoch 314/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1510 - mean_absolute_error: 0.2776 - mean_squared_error: 0.1510 - val_loss: 0.2181 - val_mean_absolute_error: 0.3450 - val_mean_squared_error: 0.2181
Epoch 315/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1605 - mean_absolute_error: 0.2894 - mean_squared_error: 0.1605 - val_loss: 0.1506 - val_mean_absolute_error: 0.2723 - val_mean_squared_error: 0.1506
Epoch 316/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1595 - mean_absolute_error: 0.2940 - mean_squared_error: 0.1595 - val_loss: 0.1760 - val_mean_absolute_error: 0.2945 - val_mean_squared_error: 0.1760
Epoch 317/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1395 - mean_absolute_error: 0.2706 - mean_squared_error: 0.1395 - val_loss: 0.1708 - val_mean_absolute_error: 0.2885 - val_mean_squared_error: 0.1708
Epoch 318/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1692 - mean_absolute_error: 0.2861 - mean_squared_error: 0.1692 - val_loss: 0.1229 - val_mean_absolute_error: 0.2305 - val_mean_squared_error: 0.1229
Epoch 319/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1242 - mean_absolute_error: 0.2418 - mean_squared_error: 0.1242 - val_loss: 0.1296 - val_mean_absolute_error: 0.2430 - val_mean_squared_error: 0.1296
Epoch 320/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1131 - mean_absolute_error: 0.2343 - mean_squared_error: 0.1131 - val_loss: 0.1634 - val_mean_absolute_error: 0.2880 - val_mean_squared_error: 0.1634
Epoch 321/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2980 - mean_absolute_error: 0.3753 - mean_squared_error: 0.2980 - val_loss: 0.1748 - val_mean_absolute_error: 0.2646 - val_mean_squared_error: 0.1748
Epoch 322/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1540 - mean_absolute_error: 0.2598 - mean_squared_error: 0.1540 - val_loss: 0.1336 - val_mean_absolute_error: 0.2409 - val_mean_squared_error: 0.1336
Epoch 323/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1216 - mean_absolute_error: 0.2395 - mean_squared_error: 0.1216 - val_loss: 0.1223 - val_mean_absolute_error: 0.2347 - val_mean_squared_error: 0.1223
Epoch 324/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1377 - mean_absolute_error: 0.2682 - mean_squared_error: 0.1377 - val_loss: 0.1475 - val_mean_absolute_error: 0.2848 - val_mean_squared_error: 0.1475
Epoch 325/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1194 - mean_absolute_error: 0.2468 - mean_squared_error: 0.1194 - val_loss: 0.1201 - val_mean_absolute_error: 0.2344 - val_mean_squared_error: 0.1201
Epoch 326/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1141 - mean_absolute_error: 0.2370 - mean_squared_error: 0.1141 - val_loss: 0.1342 - val_mean_absolute_error: 0.2703 - val_mean_squared_error: 0.1342
Epoch 327/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1283 - mean_absolute_error: 0.2593 - mean_squared_error: 0.1283 - val_loss: 0.2306 - val_mean_absolute_error: 0.3730 - val_mean_squared_error: 0.2306
Epoch 328/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1528 - mean_absolute_error: 0.2825 - mean_squared_error: 0.1528 - val_loss: 0.2154 - val_mean_absolute_error: 0.3197 - val_mean_squared_error: 0.2154
Epoch 329/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1525 - mean_absolute_error: 0.2690 - mean_squared_error: 0.1525 - val_loss: 0.1955 - val_mean_absolute_error: 0.3126 - val_mean_squared_error: 0.1955
Epoch 330/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1662 - mean_absolute_error: 0.2875 - mean_squared_error: 0.1662 - val_loss: 0.5106 - val_mean_absolute_error: 0.4961 - val_mean_squared_error: 0.5106
Epoch 331/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2252 - mean_absolute_error: 0.3230 - mean_squared_error: 0.2252 - val_loss: 0.1492 - val_mean_absolute_error: 0.2520 - val_mean_squared_error: 0.1492
Epoch 332/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1390 - mean_absolute_error: 0.2650 - mean_squared_error: 0.1390 - val_loss: 0.1247 - val_mean_absolute_error: 0.2398 - val_mean_squared_error: 0.1247
Epoch 333/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1187 - mean_absolute_error: 0.2428 - mean_squared_error: 0.1187 - val_loss: 0.1249 - val_mean_absolute_error: 0.2505 - val_mean_squared_error: 0.1249
Epoch 334/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1171 - mean_absolute_error: 0.2440 - mean_squared_error: 0.1171 - val_loss: 0.2165 - val_mean_absolute_error: 0.3448 - val_mean_squared_error: 0.2165
Epoch 335/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1649 - mean_absolute_error: 0.3059 - mean_squared_error: 0.1649 - val_loss: 0.1319 - val_mean_absolute_error: 0.2471 - val_mean_squared_error: 0.1319
Epoch 336/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1263 - mean_absolute_error: 0.2521 - mean_squared_error: 0.1263 - val_loss: 0.1204 - val_mean_absolute_error: 0.2334 - val_mean_squared_error: 0.1204
Epoch 337/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1336 - mean_absolute_error: 0.2528 - mean_squared_error: 0.1336 - val_loss: 0.1714 - val_mean_absolute_error: 0.3068 - val_mean_squared_error: 0.1714
Epoch 338/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1863 - mean_absolute_error: 0.3109 - mean_squared_error: 0.1863 - val_loss: 0.3461 - val_mean_absolute_error: 0.4723 - val_mean_squared_error: 0.3461
Epoch 339/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1735 - mean_absolute_error: 0.3130 - mean_squared_error: 0.1735 - val_loss: 0.1344 - val_mean_absolute_error: 0.2520 - val_mean_squared_error: 0.1344
Epoch 340/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1125 - mean_absolute_error: 0.2339 - mean_squared_error: 0.1125 - val_loss: 0.1640 - val_mean_absolute_error: 0.2872 - val_mean_squared_error: 0.1640
Epoch 341/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1525 - mean_absolute_error: 0.2845 - mean_squared_error: 0.1525 - val_loss: 0.1082 - val_mean_absolute_error: 0.2186 - val_mean_squared_error: 0.1082
Epoch 342/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.1148 - mean_absolute_error: 0.2403 - mean_squared_error: 0.1148 - val_loss: 0.1324 - val_mean_absolute_error: 0.2529 - val_mean_squared_error: 0.1324
Epoch 343/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1188 - mean_absolute_error: 0.2398 - mean_squared_error: 0.1188 - val_loss: 0.1292 - val_mean_absolute_error: 0.2475 - val_mean_squared_error: 0.1292
Epoch 344/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1746 - mean_absolute_error: 0.2913 - mean_squared_error: 0.1746 - val_loss: 0.1504 - val_mean_absolute_error: 0.2739 - val_mean_squared_error: 0.1504
Epoch 345/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1280 - mean_absolute_error: 0.2491 - mean_squared_error: 0.1280 - val_loss: 0.1219 - val_mean_absolute_error: 0.2335 - val_mean_squared_error: 0.1219
Epoch 346/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1100 - mean_absolute_error: 0.2327 - mean_squared_error: 0.1100 - val_loss: 0.1465 - val_mean_absolute_error: 0.2577 - val_mean_squared_error: 0.1465
Epoch 347/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3426 - mean_absolute_error: 0.4073 - mean_squared_error: 0.3426 - val_loss: 0.1988 - val_mean_absolute_error: 0.3041 - val_mean_squared_error: 0.1988
Epoch 348/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1729 - mean_absolute_error: 0.2783 - mean_squared_error: 0.1729 - val_loss: 0.1643 - val_mean_absolute_error: 0.2666 - val_mean_squared_error: 0.1643
Epoch 349/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1268 - mean_absolute_error: 0.2420 - mean_squared_error: 0.1268 - val_loss: 0.1286 - val_mean_absolute_error: 0.2483 - val_mean_squared_error: 0.1286
Epoch 350/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1196 - mean_absolute_error: 0.2447 - mean_squared_error: 0.1196 - val_loss: 0.1509 - val_mean_absolute_error: 0.2811 - val_mean_squared_error: 0.1509
Epoch 351/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1202 - mean_absolute_error: 0.2455 - mean_squared_error: 0.1202 - val_loss: 0.1167 - val_mean_absolute_error: 0.2300 - val_mean_squared_error: 0.1167
Epoch 352/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1441 - mean_absolute_error: 0.2655 - mean_squared_error: 0.1441 - val_loss: 0.1817 - val_mean_absolute_error: 0.3192 - val_mean_squared_error: 0.1817
Epoch 353/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1223 - mean_absolute_error: 0.2497 - mean_squared_error: 0.1223 - val_loss: 0.1437 - val_mean_absolute_error: 0.2707 - val_mean_squared_error: 0.1437
Epoch 354/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1360 - mean_absolute_error: 0.2596 - mean_squared_error: 0.1360 - val_loss: 0.1687 - val_mean_absolute_error: 0.2926 - val_mean_squared_error: 0.1687
Epoch 355/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1230 - mean_absolute_error: 0.2449 - mean_squared_error: 0.1230 - val_loss: 0.1937 - val_mean_absolute_error: 0.3233 - val_mean_squared_error: 0.1937
Epoch 356/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1449 - mean_absolute_error: 0.2747 - mean_squared_error: 0.1449 - val_loss: 0.1108 - val_mean_absolute_error: 0.2200 - val_mean_squared_error: 0.1108
Epoch 357/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1316 - mean_absolute_error: 0.2612 - mean_squared_error: 0.1316 - val_loss: 0.1359 - val_mean_absolute_error: 0.2659 - val_mean_squared_error: 0.1359
Epoch 358/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1366 - mean_absolute_error: 0.2653 - mean_squared_error: 0.1366 - val_loss: 0.1277 - val_mean_absolute_error: 0.2630 - val_mean_squared_error: 0.1277
Epoch 359/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1090 - mean_absolute_error: 0.2294 - mean_squared_error: 0.1090 - val_loss: 0.1760 - val_mean_absolute_error: 0.3049 - val_mean_squared_error: 0.1760
Epoch 360/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1417 - mean_absolute_error: 0.2637 - mean_squared_error: 0.1417 - val_loss: 0.2095 - val_mean_absolute_error: 0.3352 - val_mean_squared_error: 0.2095
Epoch 361/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1542 - mean_absolute_error: 0.2817 - mean_squared_error: 0.1542 - val_loss: 0.1564 - val_mean_absolute_error: 0.2786 - val_mean_squared_error: 0.1564
Epoch 362/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1188 - mean_absolute_error: 0.2460 - mean_squared_error: 0.1188 - val_loss: 0.1245 - val_mean_absolute_error: 0.2456 - val_mean_squared_error: 0.1245
Epoch 363/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1303 - mean_absolute_error: 0.2544 - mean_squared_error: 0.1303 - val_loss: 0.1777 - val_mean_absolute_error: 0.3107 - val_mean_squared_error: 0.1777
Epoch 364/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1276 - mean_absolute_error: 0.2461 - mean_squared_error: 0.1276 - val_loss: 0.1307 - val_mean_absolute_error: 0.2528 - val_mean_squared_error: 0.1307
Epoch 365/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1092 - mean_absolute_error: 0.2317 - mean_squared_error: 0.1092 - val_loss: 0.1456 - val_mean_absolute_error: 0.2787 - val_mean_squared_error: 0.1456
Epoch 366/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1078 - mean_absolute_error: 0.2346 - mean_squared_error: 0.1078 - val_loss: 0.2884 - val_mean_absolute_error: 0.3693 - val_mean_squared_error: 0.2884
Epoch 367/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.4864 - mean_absolute_error: 0.4687 - mean_squared_error: 0.4864 - val_loss: 0.2362 - val_mean_absolute_error: 0.2810 - val_mean_squared_error: 0.2362
Epoch 368/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2143 - mean_absolute_error: 0.2959 - mean_squared_error: 0.2143 - val_loss: 0.2101 - val_mean_absolute_error: 0.3068 - val_mean_squared_error: 0.2101
Epoch 369/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1651 - mean_absolute_error: 0.2736 - mean_squared_error: 0.1651 - val_loss: 0.1311 - val_mean_absolute_error: 0.2382 - val_mean_squared_error: 0.1311
Epoch 370/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1282 - mean_absolute_error: 0.2490 - mean_squared_error: 0.1282 - val_loss: 0.1321 - val_mean_absolute_error: 0.2458 - val_mean_squared_error: 0.1321
Epoch 371/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1219 - mean_absolute_error: 0.2479 - mean_squared_error: 0.1219 - val_loss: 0.1275 - val_mean_absolute_error: 0.2528 - val_mean_squared_error: 0.1275
Epoch 372/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1333 - mean_absolute_error: 0.2723 - mean_squared_error: 0.1333 - val_loss: 0.1382 - val_mean_absolute_error: 0.2645 - val_mean_squared_error: 0.1382
Epoch 373/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1268 - mean_absolute_error: 0.2566 - mean_squared_error: 0.1268 - val_loss: 0.1162 - val_mean_absolute_error: 0.2269 - val_mean_squared_error: 0.1162
Epoch 374/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1159 - mean_absolute_error: 0.2397 - mean_squared_error: 0.1159 - val_loss: 0.1151 - val_mean_absolute_error: 0.2326 - val_mean_squared_error: 0.1151
Epoch 375/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1029 - mean_absolute_error: 0.2254 - mean_squared_error: 0.1029 - val_loss: 0.1191 - val_mean_absolute_error: 0.2418 - val_mean_squared_error: 0.1191
Epoch 376/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0994 - mean_absolute_error: 0.2214 - mean_squared_error: 0.0994 - val_loss: 0.1511 - val_mean_absolute_error: 0.2652 - val_mean_squared_error: 0.1511
Epoch 377/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1400 - mean_absolute_error: 0.2723 - mean_squared_error: 0.1400 - val_loss: 0.1371 - val_mean_absolute_error: 0.2495 - val_mean_squared_error: 0.1371
Epoch 378/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1264 - mean_absolute_error: 0.2485 - mean_squared_error: 0.1264 - val_loss: 0.1193 - val_mean_absolute_error: 0.2463 - val_mean_squared_error: 0.1193
Epoch 379/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1298 - mean_absolute_error: 0.2573 - mean_squared_error: 0.1298 - val_loss: 0.1361 - val_mean_absolute_error: 0.2752 - val_mean_squared_error: 0.1361
Epoch 380/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1025 - mean_absolute_error: 0.2273 - mean_squared_error: 0.1025 - val_loss: 0.1400 - val_mean_absolute_error: 0.2555 - val_mean_squared_error: 0.1400
Epoch 381/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2346 - mean_absolute_error: 0.3338 - mean_squared_error: 0.2346 - val_loss: 0.2767 - val_mean_absolute_error: 0.4283 - val_mean_squared_error: 0.2767
Epoch 382/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1742 - mean_absolute_error: 0.3032 - mean_squared_error: 0.1742 - val_loss: 0.1697 - val_mean_absolute_error: 0.2872 - val_mean_squared_error: 0.1697
Epoch 383/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1440 - mean_absolute_error: 0.2687 - mean_squared_error: 0.1440 - val_loss: 0.1452 - val_mean_absolute_error: 0.2776 - val_mean_squared_error: 0.1452
Epoch 384/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1402 - mean_absolute_error: 0.2698 - mean_squared_error: 0.1402 - val_loss: 0.1476 - val_mean_absolute_error: 0.2821 - val_mean_squared_error: 0.1476
Epoch 385/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1258 - mean_absolute_error: 0.2541 - mean_squared_error: 0.1258 - val_loss: 0.1074 - val_mean_absolute_error: 0.2180 - val_mean_squared_error: 0.1074
Epoch 386/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.1137 - mean_absolute_error: 0.2404 - mean_squared_error: 0.1137 - val_loss: 0.1099 - val_mean_absolute_error: 0.2196 - val_mean_squared_error: 0.1099
Epoch 387/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1042 - mean_absolute_error: 0.2275 - mean_squared_error: 0.1042 - val_loss: 0.1042 - val_mean_absolute_error: 0.2183 - val_mean_squared_error: 0.1042
Epoch 388/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.1228 - mean_absolute_error: 0.2470 - mean_squared_error: 0.1228 - val_loss: 0.0992 - val_mean_absolute_error: 0.2088 - val_mean_squared_error: 0.0992
Epoch 389/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.1018 - mean_absolute_error: 0.2225 - mean_squared_error: 0.1018 - val_loss: 0.1056 - val_mean_absolute_error: 0.2161 - val_mean_squared_error: 0.1056
Epoch 390/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1636 - mean_absolute_error: 0.2707 - mean_squared_error: 0.1636 - val_loss: 0.5554 - val_mean_absolute_error: 0.5929 - val_mean_squared_error: 0.5554
Epoch 391/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2073 - mean_absolute_error: 0.3226 - mean_squared_error: 0.2073 - val_loss: 0.1557 - val_mean_absolute_error: 0.2849 - val_mean_squared_error: 0.1557
Epoch 392/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1342 - mean_absolute_error: 0.2538 - mean_squared_error: 0.1342 - val_loss: 0.1365 - val_mean_absolute_error: 0.2762 - val_mean_squared_error: 0.1365
Epoch 393/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1080 - mean_absolute_error: 0.2350 - mean_squared_error: 0.1080 - val_loss: 0.1078 - val_mean_absolute_error: 0.2290 - val_mean_squared_error: 0.1078
Epoch 394/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0947 - mean_absolute_error: 0.2148 - mean_squared_error: 0.0947 - val_loss: 0.1050 - val_mean_absolute_error: 0.2215 - val_mean_squared_error: 0.1050
Epoch 395/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1052 - mean_absolute_error: 0.2288 - mean_squared_error: 0.1052 - val_loss: 0.1620 - val_mean_absolute_error: 0.2872 - val_mean_squared_error: 0.1620
Epoch 396/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1133 - mean_absolute_error: 0.2380 - mean_squared_error: 0.1133 - val_loss: 0.1015 - val_mean_absolute_error: 0.2170 - val_mean_squared_error: 0.1015
Epoch 397/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1182 - mean_absolute_error: 0.2387 - mean_squared_error: 0.1182 - val_loss: 0.0997 - val_mean_absolute_error: 0.2164 - val_mean_squared_error: 0.0997
Epoch 398/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1016 - mean_absolute_error: 0.2200 - mean_squared_error: 0.1016 - val_loss: 0.1516 - val_mean_absolute_error: 0.2707 - val_mean_squared_error: 0.1516
Epoch 399/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1371 - mean_absolute_error: 0.2600 - mean_squared_error: 0.1371 - val_loss: 0.0960 - val_mean_absolute_error: 0.2050 - val_mean_squared_error: 0.0960
Epoch 400/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1276 - mean_absolute_error: 0.2468 - mean_squared_error: 0.1276 - val_loss: 0.1028 - val_mean_absolute_error: 0.2097 - val_mean_squared_error: 0.1028
Epoch 401/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1052 - mean_absolute_error: 0.2295 - mean_squared_error: 0.1052 - val_loss: 0.1022 - val_mean_absolute_error: 0.2121 - val_mean_squared_error: 0.1022
Epoch 402/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1140 - mean_absolute_error: 0.2379 - mean_squared_error: 0.1140 - val_loss: 0.1377 - val_mean_absolute_error: 0.2752 - val_mean_squared_error: 0.1377
Epoch 403/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1101 - mean_absolute_error: 0.2290 - mean_squared_error: 0.1101 - val_loss: 0.0955 - val_mean_absolute_error: 0.2054 - val_mean_squared_error: 0.0955
Epoch 404/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1169 - mean_absolute_error: 0.2361 - mean_squared_error: 0.1169 - val_loss: 0.0938 - val_mean_absolute_error: 0.2016 - val_mean_squared_error: 0.0938
Epoch 405/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.0917 - mean_absolute_error: 0.2074 - mean_squared_error: 0.0917 - val_loss: 0.0941 - val_mean_absolute_error: 0.2041 - val_mean_squared_error: 0.0941
Epoch 406/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1942 - mean_absolute_error: 0.2957 - mean_squared_error: 0.1942 - val_loss: 0.1214 - val_mean_absolute_error: 0.2460 - val_mean_squared_error: 0.1214
Epoch 407/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1074 - mean_absolute_error: 0.2279 - mean_squared_error: 0.1074 - val_loss: 0.1209 - val_mean_absolute_error: 0.2408 - val_mean_squared_error: 0.1209
Epoch 408/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0957 - mean_absolute_error: 0.2146 - mean_squared_error: 0.0957 - val_loss: 0.1166 - val_mean_absolute_error: 0.2449 - val_mean_squared_error: 0.1166
Epoch 409/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1558 - mean_absolute_error: 0.2875 - mean_squared_error: 0.1558 - val_loss: 0.1224 - val_mean_absolute_error: 0.2416 - val_mean_squared_error: 0.1224
Epoch 410/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1084 - mean_absolute_error: 0.2320 - mean_squared_error: 0.1084 - val_loss: 0.1238 - val_mean_absolute_error: 0.2470 - val_mean_squared_error: 0.1238
Epoch 411/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1118 - mean_absolute_error: 0.2310 - mean_squared_error: 0.1118 - val_loss: 0.2839 - val_mean_absolute_error: 0.3905 - val_mean_squared_error: 0.2839
Epoch 412/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1510 - mean_absolute_error: 0.2873 - mean_squared_error: 0.1510 - val_loss: 0.1062 - val_mean_absolute_error: 0.2190 - val_mean_squared_error: 0.1062
Epoch 413/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0987 - mean_absolute_error: 0.2205 - mean_squared_error: 0.0987 - val_loss: 0.1160 - val_mean_absolute_error: 0.2291 - val_mean_squared_error: 0.1160
Epoch 414/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2008 - mean_absolute_error: 0.2971 - mean_squared_error: 0.2008 - val_loss: 0.3790 - val_mean_absolute_error: 0.4446 - val_mean_squared_error: 0.3790
Epoch 415/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1929 - mean_absolute_error: 0.3115 - mean_squared_error: 0.1929 - val_loss: 0.1964 - val_mean_absolute_error: 0.2894 - val_mean_squared_error: 0.1964
Epoch 416/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1396 - mean_absolute_error: 0.2561 - mean_squared_error: 0.1396 - val_loss: 0.1108 - val_mean_absolute_error: 0.2281 - val_mean_squared_error: 0.1108
Epoch 417/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0987 - mean_absolute_error: 0.2205 - mean_squared_error: 0.0987 - val_loss: 0.1000 - val_mean_absolute_error: 0.2131 - val_mean_squared_error: 0.1000
Epoch 418/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1192 - mean_absolute_error: 0.2534 - mean_squared_error: 0.1192 - val_loss: 0.1080 - val_mean_absolute_error: 0.2241 - val_mean_squared_error: 0.1080
Epoch 419/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1013 - mean_absolute_error: 0.2249 - mean_squared_error: 0.1013 - val_loss: 0.0977 - val_mean_absolute_error: 0.2079 - val_mean_squared_error: 0.0977
Epoch 420/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0993 - mean_absolute_error: 0.2200 - mean_squared_error: 0.0993 - val_loss: 0.1205 - val_mean_absolute_error: 0.2330 - val_mean_squared_error: 0.1205
Epoch 421/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1001 - mean_absolute_error: 0.2176 - mean_squared_error: 0.1001 - val_loss: 0.0996 - val_mean_absolute_error: 0.2143 - val_mean_squared_error: 0.0996
Epoch 422/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1361 - mean_absolute_error: 0.2599 - mean_squared_error: 0.1361 - val_loss: 0.1109 - val_mean_absolute_error: 0.2282 - val_mean_squared_error: 0.1109
Epoch 423/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0983 - mean_absolute_error: 0.2166 - mean_squared_error: 0.0983 - val_loss: 0.1579 - val_mean_absolute_error: 0.2734 - val_mean_squared_error: 0.1579
Epoch 424/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1767 - mean_absolute_error: 0.2912 - mean_squared_error: 0.1767 - val_loss: 0.1275 - val_mean_absolute_error: 0.2663 - val_mean_squared_error: 0.1275
Epoch 425/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1063 - mean_absolute_error: 0.2347 - mean_squared_error: 0.1063 - val_loss: 0.0965 - val_mean_absolute_error: 0.2096 - val_mean_squared_error: 0.0965
Epoch 426/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0954 - mean_absolute_error: 0.2187 - mean_squared_error: 0.0954 - val_loss: 0.0998 - val_mean_absolute_error: 0.2231 - val_mean_squared_error: 0.0998
Epoch 427/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0926 - mean_absolute_error: 0.2155 - mean_squared_error: 0.0926 - val_loss: 0.1091 - val_mean_absolute_error: 0.2200 - val_mean_squared_error: 0.1091
Epoch 428/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1263 - mean_absolute_error: 0.2514 - mean_squared_error: 0.1263 - val_loss: 0.1047 - val_mean_absolute_error: 0.2160 - val_mean_squared_error: 0.1047
Epoch 429/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1139 - mean_absolute_error: 0.2355 - mean_squared_error: 0.1139 - val_loss: 0.1035 - val_mean_absolute_error: 0.2184 - val_mean_squared_error: 0.1035
Epoch 430/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1044 - mean_absolute_error: 0.2338 - mean_squared_error: 0.1044 - val_loss: 0.0983 - val_mean_absolute_error: 0.2093 - val_mean_squared_error: 0.0983
Epoch 431/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2244 - mean_absolute_error: 0.3297 - mean_squared_error: 0.2244 - val_loss: 0.1974 - val_mean_absolute_error: 0.3066 - val_mean_squared_error: 0.1974
Epoch 432/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1386 - mean_absolute_error: 0.2626 - mean_squared_error: 0.1386 - val_loss: 0.1333 - val_mean_absolute_error: 0.2553 - val_mean_squared_error: 0.1333
Epoch 433/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1086 - mean_absolute_error: 0.2325 - mean_squared_error: 0.1086 - val_loss: 0.1125 - val_mean_absolute_error: 0.2385 - val_mean_squared_error: 0.1125
Epoch 434/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1118 - mean_absolute_error: 0.2440 - mean_squared_error: 0.1118 - val_loss: 0.1330 - val_mean_absolute_error: 0.2664 - val_mean_squared_error: 0.1330
Epoch 435/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1097 - mean_absolute_error: 0.2377 - mean_squared_error: 0.1097 - val_loss: 0.1106 - val_mean_absolute_error: 0.2245 - val_mean_squared_error: 0.1106
Epoch 436/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0993 - mean_absolute_error: 0.2178 - mean_squared_error: 0.0993 - val_loss: 0.0950 - val_mean_absolute_error: 0.2094 - val_mean_squared_error: 0.0950
Epoch 437/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1057 - mean_absolute_error: 0.2234 - mean_squared_error: 0.1057 - val_loss: 0.0927 - val_mean_absolute_error: 0.2019 - val_mean_squared_error: 0.0927
Epoch 438/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0913 - mean_absolute_error: 0.2068 - mean_squared_error: 0.0913 - val_loss: 0.0921 - val_mean_absolute_error: 0.2012 - val_mean_squared_error: 0.0921
Epoch 439/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.2995 - mean_absolute_error: 0.3470 - mean_squared_error: 0.2995 - val_loss: 0.1677 - val_mean_absolute_error: 0.2892 - val_mean_squared_error: 0.1677
Epoch 440/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.1686 - mean_absolute_error: 0.2791 - mean_squared_error: 0.1686 - val_loss: 0.1407 - val_mean_absolute_error: 0.2440 - val_mean_squared_error: 0.1407
Epoch 441/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1165 - mean_absolute_error: 0.2302 - mean_squared_error: 0.1165 - val_loss: 0.1084 - val_mean_absolute_error: 0.2218 - val_mean_squared_error: 0.1084
Epoch 442/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1036 - mean_absolute_error: 0.2266 - mean_squared_error: 0.1036 - val_loss: 0.1075 - val_mean_absolute_error: 0.2349 - val_mean_squared_error: 0.1075
Epoch 443/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1054 - mean_absolute_error: 0.2340 - mean_squared_error: 0.1054 - val_loss: 0.1022 - val_mean_absolute_error: 0.2191 - val_mean_squared_error: 0.1022
Epoch 444/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1067 - mean_absolute_error: 0.2367 - mean_squared_error: 0.1067 - val_loss: 0.1240 - val_mean_absolute_error: 0.2695 - val_mean_squared_error: 0.1240
Epoch 445/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0904 - mean_absolute_error: 0.2108 - mean_squared_error: 0.0904 - val_loss: 0.0951 - val_mean_absolute_error: 0.2081 - val_mean_squared_error: 0.0951
Epoch 446/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0871 - mean_absolute_error: 0.2054 - mean_squared_error: 0.0871 - val_loss: 0.0933 - val_mean_absolute_error: 0.2067 - val_mean_squared_error: 0.0933
Epoch 447/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1694 - mean_absolute_error: 0.2901 - mean_squared_error: 0.1694 - val_loss: 0.1765 - val_mean_absolute_error: 0.3008 - val_mean_squared_error: 0.1765
Epoch 448/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1237 - mean_absolute_error: 0.2467 - mean_squared_error: 0.1237 - val_loss: 0.1293 - val_mean_absolute_error: 0.2525 - val_mean_squared_error: 0.1293
Epoch 449/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1045 - mean_absolute_error: 0.2352 - mean_squared_error: 0.1045 - val_loss: 0.0929 - val_mean_absolute_error: 0.2043 - val_mean_squared_error: 0.0929
Epoch 450/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1032 - mean_absolute_error: 0.2321 - mean_squared_error: 0.1032 - val_loss: 0.1460 - val_mean_absolute_error: 0.2906 - val_mean_squared_error: 0.1460
Epoch 451/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1308 - mean_absolute_error: 0.2561 - mean_squared_error: 0.1308 - val_loss: 0.2026 - val_mean_absolute_error: 0.3445 - val_mean_squared_error: 0.2026
Epoch 452/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1152 - mean_absolute_error: 0.2409 - mean_squared_error: 0.1152 - val_loss: 0.1134 - val_mean_absolute_error: 0.2236 - val_mean_squared_error: 0.1134
Epoch 453/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1039 - mean_absolute_error: 0.2307 - mean_squared_error: 0.1039 - val_loss: 0.1006 - val_mean_absolute_error: 0.2290 - val_mean_squared_error: 0.1006
Epoch 454/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1697 - mean_absolute_error: 0.2738 - mean_squared_error: 0.1697 - val_loss: 0.4923 - val_mean_absolute_error: 0.4947 - val_mean_squared_error: 0.4923
Epoch 455/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1603 - mean_absolute_error: 0.2704 - mean_squared_error: 0.1603 - val_loss: 0.1103 - val_mean_absolute_error: 0.2148 - val_mean_squared_error: 0.1103
Epoch 456/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1017 - mean_absolute_error: 0.2200 - mean_squared_error: 0.1017 - val_loss: 0.1035 - val_mean_absolute_error: 0.2285 - val_mean_squared_error: 0.1035
Epoch 457/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0946 - mean_absolute_error: 0.2188 - mean_squared_error: 0.0946 - val_loss: 0.1201 - val_mean_absolute_error: 0.2537 - val_mean_squared_error: 0.1201
Epoch 458/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0965 - mean_absolute_error: 0.2189 - mean_squared_error: 0.0965 - val_loss: 0.1508 - val_mean_absolute_error: 0.2842 - val_mean_squared_error: 0.1508
Epoch 459/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1196 - mean_absolute_error: 0.2574 - mean_squared_error: 0.1196 - val_loss: 0.1415 - val_mean_absolute_error: 0.2636 - val_mean_squared_error: 0.1415
Epoch 460/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1145 - mean_absolute_error: 0.2334 - mean_squared_error: 0.1145 - val_loss: 0.0948 - val_mean_absolute_error: 0.2012 - val_mean_squared_error: 0.0948
Epoch 461/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0833 - mean_absolute_error: 0.1976 - mean_squared_error: 0.0833 - val_loss: 0.1050 - val_mean_absolute_error: 0.2160 - val_mean_squared_error: 0.1050
Epoch 462/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1151 - mean_absolute_error: 0.2384 - mean_squared_error: 0.1151 - val_loss: 0.2805 - val_mean_absolute_error: 0.3663 - val_mean_squared_error: 0.2805
Epoch 463/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1488 - mean_absolute_error: 0.2770 - mean_squared_error: 0.1488 - val_loss: 0.0981 - val_mean_absolute_error: 0.2084 - val_mean_squared_error: 0.0981
Epoch 464/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0978 - mean_absolute_error: 0.2241 - mean_squared_error: 0.0978 - val_loss: 0.1518 - val_mean_absolute_error: 0.2749 - val_mean_squared_error: 0.1518
Epoch 465/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.1270 - mean_absolute_error: 0.2489 - mean_squared_error: 0.1270 - val_loss: 0.0873 - val_mean_absolute_error: 0.1938 - val_mean_squared_error: 0.0873
Epoch 466/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.0895 - mean_absolute_error: 0.2063 - mean_squared_error: 0.0895 - val_loss: 0.1122 - val_mean_absolute_error: 0.2330 - val_mean_squared_error: 0.1122
Epoch 467/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0864 - mean_absolute_error: 0.2050 - mean_squared_error: 0.0864 - val_loss: 0.0916 - val_mean_absolute_error: 0.2097 - val_mean_squared_error: 0.0916
Epoch 468/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0840 - mean_absolute_error: 0.2020 - mean_squared_error: 0.0840 - val_loss: 0.1621 - val_mean_absolute_error: 0.2847 - val_mean_squared_error: 0.1621
Epoch 469/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3067 - mean_absolute_error: 0.3813 - mean_squared_error: 0.3067 - val_loss: 0.1320 - val_mean_absolute_error: 0.2414 - val_mean_squared_error: 0.1320
Epoch 470/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1326 - mean_absolute_error: 0.2464 - mean_squared_error: 0.1326 - val_loss: 0.1081 - val_mean_absolute_error: 0.2112 - val_mean_squared_error: 0.1081
Epoch 471/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0954 - mean_absolute_error: 0.2116 - mean_squared_error: 0.0954 - val_loss: 0.0985 - val_mean_absolute_error: 0.2182 - val_mean_squared_error: 0.0985
Epoch 472/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0904 - mean_absolute_error: 0.2107 - mean_squared_error: 0.0904 - val_loss: 0.1149 - val_mean_absolute_error: 0.2437 - val_mean_squared_error: 0.1149
Epoch 473/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0918 - mean_absolute_error: 0.2183 - mean_squared_error: 0.0918 - val_loss: 0.1007 - val_mean_absolute_error: 0.2129 - val_mean_squared_error: 0.1007
Epoch 474/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0879 - mean_absolute_error: 0.2104 - mean_squared_error: 0.0879 - val_loss: 0.1160 - val_mean_absolute_error: 0.2418 - val_mean_squared_error: 0.1160
Epoch 475/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0968 - mean_absolute_error: 0.2177 - mean_squared_error: 0.0968 - val_loss: 0.0922 - val_mean_absolute_error: 0.2130 - val_mean_squared_error: 0.0922
Epoch 476/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0951 - mean_absolute_error: 0.2189 - mean_squared_error: 0.0951 - val_loss: 0.0899 - val_mean_absolute_error: 0.2048 - val_mean_squared_error: 0.0899
Epoch 477/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0907 - mean_absolute_error: 0.2072 - mean_squared_error: 0.0907 - val_loss: 0.2816 - val_mean_absolute_error: 0.3809 - val_mean_squared_error: 0.2816
Epoch 478/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.2816 - mean_absolute_error: 0.3537 - mean_squared_error: 0.2816 - val_loss: 0.1445 - val_mean_absolute_error: 0.2372 - val_mean_squared_error: 0.1445
Epoch 479/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1251 - mean_absolute_error: 0.2373 - mean_squared_error: 0.1251 - val_loss: 0.1126 - val_mean_absolute_error: 0.2240 - val_mean_squared_error: 0.1126
Epoch 480/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1007 - mean_absolute_error: 0.2202 - mean_squared_error: 0.1007 - val_loss: 0.1295 - val_mean_absolute_error: 0.2525 - val_mean_squared_error: 0.1295
Epoch 481/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1408 - mean_absolute_error: 0.2691 - mean_squared_error: 0.1408 - val_loss: 0.1048 - val_mean_absolute_error: 0.2191 - val_mean_squared_error: 0.1048
Epoch 482/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0927 - mean_absolute_error: 0.2146 - mean_squared_error: 0.0927 - val_loss: 0.1108 - val_mean_absolute_error: 0.2276 - val_mean_squared_error: 0.1108
Epoch 483/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0849 - mean_absolute_error: 0.2026 - mean_squared_error: 0.0849 - val_loss: 0.1205 - val_mean_absolute_error: 0.2576 - val_mean_squared_error: 0.1205
Epoch 484/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1003 - mean_absolute_error: 0.2210 - mean_squared_error: 0.1003 - val_loss: 0.0854 - val_mean_absolute_error: 0.1939 - val_mean_squared_error: 0.0854
Epoch 485/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.0783 - mean_absolute_error: 0.1929 - mean_squared_error: 0.0783 - val_loss: 0.0861 - val_mean_absolute_error: 0.1987 - val_mean_squared_error: 0.0861
Epoch 486/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1583 - mean_absolute_error: 0.2723 - mean_squared_error: 0.1583 - val_loss: 0.1131 - val_mean_absolute_error: 0.2349 - val_mean_squared_error: 0.1131
Epoch 487/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0952 - mean_absolute_error: 0.2113 - mean_squared_error: 0.0952 - val_loss: 0.1030 - val_mean_absolute_error: 0.2237 - val_mean_squared_error: 0.1030
Epoch 488/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0879 - mean_absolute_error: 0.2053 - mean_squared_error: 0.0879 - val_loss: 0.1181 - val_mean_absolute_error: 0.2616 - val_mean_squared_error: 0.1181
Epoch 489/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0991 - mean_absolute_error: 0.2267 - mean_squared_error: 0.0991 - val_loss: 0.0940 - val_mean_absolute_error: 0.2058 - val_mean_squared_error: 0.0940
Epoch 490/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0881 - mean_absolute_error: 0.2040 - mean_squared_error: 0.0881 - val_loss: 0.0814 - val_mean_absolute_error: 0.1895 - val_mean_squared_error: 0.0814
Epoch 491/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1144 - mean_absolute_error: 0.2274 - mean_squared_error: 0.1144 - val_loss: 0.1126 - val_mean_absolute_error: 0.2243 - val_mean_squared_error: 0.1126
Epoch 492/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0871 - mean_absolute_error: 0.2044 - mean_squared_error: 0.0871 - val_loss: 0.0837 - val_mean_absolute_error: 0.1936 - val_mean_squared_error: 0.0837
Epoch 493/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0972 - mean_absolute_error: 0.2134 - mean_squared_error: 0.0972 - val_loss: 0.1754 - val_mean_absolute_error: 0.2896 - val_mean_squared_error: 0.1754
Epoch 494/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1187 - mean_absolute_error: 0.2481 - mean_squared_error: 0.1187 - val_loss: 0.1321 - val_mean_absolute_error: 0.2690 - val_mean_squared_error: 0.1321
Epoch 495/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0912 - mean_absolute_error: 0.2166 - mean_squared_error: 0.0912 - val_loss: 0.1033 - val_mean_absolute_error: 0.2324 - val_mean_squared_error: 0.1033
Epoch 496/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0833 - mean_absolute_error: 0.1988 - mean_squared_error: 0.0833 - val_loss: 0.1458 - val_mean_absolute_error: 0.2709 - val_mean_squared_error: 0.1458
Epoch 497/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1916 - mean_absolute_error: 0.3002 - mean_squared_error: 0.1916 - val_loss: 0.1403 - val_mean_absolute_error: 0.2718 - val_mean_squared_error: 0.1403
Epoch 498/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1003 - mean_absolute_error: 0.2173 - mean_squared_error: 0.1003 - val_loss: 0.0986 - val_mean_absolute_error: 0.2144 - val_mean_squared_error: 0.0986
Epoch 499/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0903 - mean_absolute_error: 0.2121 - mean_squared_error: 0.0903 - val_loss: 0.1116 - val_mean_absolute_error: 0.2293 - val_mean_squared_error: 0.1116
Epoch 500/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0941 - mean_absolute_error: 0.2183 - mean_squared_error: 0.0941 - val_loss: 0.1167 - val_mean_absolute_error: 0.2501 - val_mean_squared_error: 0.1167
Epoch 501/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0948 - mean_absolute_error: 0.2197 - mean_squared_error: 0.0948 - val_loss: 0.0942 - val_mean_absolute_error: 0.2171 - val_mean_squared_error: 0.0942
Epoch 502/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0863 - mean_absolute_error: 0.2023 - mean_squared_error: 0.0863 - val_loss: 0.0932 - val_mean_absolute_error: 0.2060 - val_mean_squared_error: 0.0932
Epoch 503/1000
8000/8000 [==============================] - 1s 134us/step - loss: 0.1001 - mean_absolute_error: 0.2200 - mean_squared_error: 0.1001 - val_loss: 0.0941 - val_mean_absolute_error: 0.2098 - val_mean_squared_error: 0.0941
Epoch 504/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0849 - mean_absolute_error: 0.2012 - mean_squared_error: 0.0849 - val_loss: 0.0990 - val_mean_absolute_error: 0.2168 - val_mean_squared_error: 0.0990
Epoch 505/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1447 - mean_absolute_error: 0.2609 - mean_squared_error: 0.1447 - val_loss: 0.0922 - val_mean_absolute_error: 0.2140 - val_mean_squared_error: 0.0922
Epoch 506/1000
8000/8000 [==============================] - 1s 134us/step - loss: 0.0910 - mean_absolute_error: 0.2078 - mean_squared_error: 0.0910 - val_loss: 0.0866 - val_mean_absolute_error: 0.2000 - val_mean_squared_error: 0.0866
Epoch 507/1000
8000/8000 [==============================] - 1s 134us/step - loss: 0.0756 - mean_absolute_error: 0.1884 - mean_squared_error: 0.0756 - val_loss: 0.1576 - val_mean_absolute_error: 0.2683 - val_mean_squared_error: 0.1576
Epoch 508/1000
8000/8000 [==============================] - 1s 134us/step - loss: 0.1723 - mean_absolute_error: 0.2892 - mean_squared_error: 0.1723 - val_loss: 0.0968 - val_mean_absolute_error: 0.2025 - val_mean_squared_error: 0.0968
Epoch 509/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0959 - mean_absolute_error: 0.2130 - mean_squared_error: 0.0959 - val_loss: 0.0962 - val_mean_absolute_error: 0.2062 - val_mean_squared_error: 0.0962
Epoch 510/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0809 - mean_absolute_error: 0.1994 - mean_squared_error: 0.0809 - val_loss: 0.1189 - val_mean_absolute_error: 0.2493 - val_mean_squared_error: 0.1189
Epoch 511/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1381 - mean_absolute_error: 0.2737 - mean_squared_error: 0.1381 - val_loss: 0.1229 - val_mean_absolute_error: 0.2475 - val_mean_squared_error: 0.1229
Epoch 512/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1070 - mean_absolute_error: 0.2432 - mean_squared_error: 0.1070 - val_loss: 0.1099 - val_mean_absolute_error: 0.2257 - val_mean_squared_error: 0.1099
Epoch 513/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0947 - mean_absolute_error: 0.2118 - mean_squared_error: 0.0947 - val_loss: 0.0801 - val_mean_absolute_error: 0.1881 - val_mean_squared_error: 0.0801
Epoch 514/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.0951 - mean_absolute_error: 0.2232 - mean_squared_error: 0.0951 - val_loss: 0.0942 - val_mean_absolute_error: 0.2148 - val_mean_squared_error: 0.0942
Epoch 515/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0802 - mean_absolute_error: 0.1943 - mean_squared_error: 0.0802 - val_loss: 0.0924 - val_mean_absolute_error: 0.2062 - val_mean_squared_error: 0.0924
Epoch 516/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1262 - mean_absolute_error: 0.2436 - mean_squared_error: 0.1262 - val_loss: 0.1013 - val_mean_absolute_error: 0.2163 - val_mean_squared_error: 0.1013
Epoch 517/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0848 - mean_absolute_error: 0.1998 - mean_squared_error: 0.0848 - val_loss: 0.0830 - val_mean_absolute_error: 0.1911 - val_mean_squared_error: 0.0830
Epoch 518/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0815 - mean_absolute_error: 0.1976 - mean_squared_error: 0.0815 - val_loss: 0.1157 - val_mean_absolute_error: 0.2365 - val_mean_squared_error: 0.1157
Epoch 519/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1614 - mean_absolute_error: 0.2817 - mean_squared_error: 0.1614 - val_loss: 0.1339 - val_mean_absolute_error: 0.2628 - val_mean_squared_error: 0.1339
Epoch 520/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0925 - mean_absolute_error: 0.2104 - mean_squared_error: 0.0925 - val_loss: 0.0947 - val_mean_absolute_error: 0.2085 - val_mean_squared_error: 0.0947
Epoch 521/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0866 - mean_absolute_error: 0.2091 - mean_squared_error: 0.0866 - val_loss: 0.0812 - val_mean_absolute_error: 0.1925 - val_mean_squared_error: 0.0812
Epoch 522/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1077 - mean_absolute_error: 0.2349 - mean_squared_error: 0.1077 - val_loss: 0.1221 - val_mean_absolute_error: 0.2351 - val_mean_squared_error: 0.1221
Epoch 523/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1092 - mean_absolute_error: 0.2382 - mean_squared_error: 0.1092 - val_loss: 0.0813 - val_mean_absolute_error: 0.1864 - val_mean_squared_error: 0.0813
Epoch 524/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0761 - mean_absolute_error: 0.1898 - mean_squared_error: 0.0761 - val_loss: 0.0783 - val_mean_absolute_error: 0.1879 - val_mean_squared_error: 0.0783
Epoch 525/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.1091 - mean_absolute_error: 0.2291 - mean_squared_error: 0.1091 - val_loss: 0.2333 - val_mean_absolute_error: 0.3524 - val_mean_squared_error: 0.2333
Epoch 526/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1250 - mean_absolute_error: 0.2532 - mean_squared_error: 0.1250 - val_loss: 0.0990 - val_mean_absolute_error: 0.2233 - val_mean_squared_error: 0.0990
Epoch 527/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0968 - mean_absolute_error: 0.2136 - mean_squared_error: 0.0968 - val_loss: 0.0822 - val_mean_absolute_error: 0.1899 - val_mean_squared_error: 0.0822
Epoch 528/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1121 - mean_absolute_error: 0.2297 - mean_squared_error: 0.1121 - val_loss: 0.1754 - val_mean_absolute_error: 0.2813 - val_mean_squared_error: 0.1754
Epoch 529/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1057 - mean_absolute_error: 0.2213 - mean_squared_error: 0.1057 - val_loss: 0.1176 - val_mean_absolute_error: 0.2378 - val_mean_squared_error: 0.1176
Epoch 530/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0819 - mean_absolute_error: 0.1970 - mean_squared_error: 0.0819 - val_loss: 0.0817 - val_mean_absolute_error: 0.1885 - val_mean_squared_error: 0.0817
Epoch 531/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0904 - mean_absolute_error: 0.2117 - mean_squared_error: 0.0904 - val_loss: 0.1642 - val_mean_absolute_error: 0.2882 - val_mean_squared_error: 0.1642
Epoch 532/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0939 - mean_absolute_error: 0.2151 - mean_squared_error: 0.0939 - val_loss: 0.0878 - val_mean_absolute_error: 0.2108 - val_mean_squared_error: 0.0878
Epoch 533/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0753 - mean_absolute_error: 0.1891 - mean_squared_error: 0.0753 - val_loss: 0.0788 - val_mean_absolute_error: 0.1862 - val_mean_squared_error: 0.0788
Epoch 534/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.2074 - mean_absolute_error: 0.3023 - mean_squared_error: 0.2074 - val_loss: 0.1714 - val_mean_absolute_error: 0.2922 - val_mean_squared_error: 0.1714
Epoch 535/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1138 - mean_absolute_error: 0.2277 - mean_squared_error: 0.1138 - val_loss: 0.1080 - val_mean_absolute_error: 0.2324 - val_mean_squared_error: 0.1080
Epoch 536/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0874 - mean_absolute_error: 0.2053 - mean_squared_error: 0.0874 - val_loss: 0.0973 - val_mean_absolute_error: 0.2135 - val_mean_squared_error: 0.0973
Epoch 537/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1008 - mean_absolute_error: 0.2311 - mean_squared_error: 0.1008 - val_loss: 0.0928 - val_mean_absolute_error: 0.2033 - val_mean_squared_error: 0.0928
Epoch 538/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0782 - mean_absolute_error: 0.1954 - mean_squared_error: 0.0782 - val_loss: 0.0811 - val_mean_absolute_error: 0.1946 - val_mean_squared_error: 0.0811
Epoch 539/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1002 - mean_absolute_error: 0.2222 - mean_squared_error: 0.1002 - val_loss: 0.1382 - val_mean_absolute_error: 0.2663 - val_mean_squared_error: 0.1382
Epoch 540/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0955 - mean_absolute_error: 0.2182 - mean_squared_error: 0.0955 - val_loss: 0.0937 - val_mean_absolute_error: 0.2053 - val_mean_squared_error: 0.0937
Epoch 541/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0858 - mean_absolute_error: 0.2067 - mean_squared_error: 0.0858 - val_loss: 0.0797 - val_mean_absolute_error: 0.1902 - val_mean_squared_error: 0.0797
Epoch 542/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1782 - mean_absolute_error: 0.2699 - mean_squared_error: 0.1782 - val_loss: 0.5576 - val_mean_absolute_error: 0.5449 - val_mean_squared_error: 0.5576
Epoch 543/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1875 - mean_absolute_error: 0.2995 - mean_squared_error: 0.1875 - val_loss: 0.1394 - val_mean_absolute_error: 0.2651 - val_mean_squared_error: 0.1394
Epoch 544/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0998 - mean_absolute_error: 0.2194 - mean_squared_error: 0.0998 - val_loss: 0.0972 - val_mean_absolute_error: 0.2149 - val_mean_squared_error: 0.0972
Epoch 545/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0869 - mean_absolute_error: 0.2089 - mean_squared_error: 0.0869 - val_loss: 0.1011 - val_mean_absolute_error: 0.2203 - val_mean_squared_error: 0.1011
Epoch 546/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0922 - mean_absolute_error: 0.2200 - mean_squared_error: 0.0922 - val_loss: 0.0930 - val_mean_absolute_error: 0.2194 - val_mean_squared_error: 0.0930
Epoch 547/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1029 - mean_absolute_error: 0.2310 - mean_squared_error: 0.1029 - val_loss: 0.1300 - val_mean_absolute_error: 0.2485 - val_mean_squared_error: 0.1300
Epoch 548/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0947 - mean_absolute_error: 0.2161 - mean_squared_error: 0.0947 - val_loss: 0.0867 - val_mean_absolute_error: 0.2031 - val_mean_squared_error: 0.0867
Epoch 549/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0738 - mean_absolute_error: 0.1877 - mean_squared_error: 0.0738 - val_loss: 0.0985 - val_mean_absolute_error: 0.2157 - val_mean_squared_error: 0.0985
Epoch 550/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1150 - mean_absolute_error: 0.2460 - mean_squared_error: 0.1150 - val_loss: 0.1181 - val_mean_absolute_error: 0.2354 - val_mean_squared_error: 0.1181
Epoch 551/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0896 - mean_absolute_error: 0.2094 - mean_squared_error: 0.0896 - val_loss: 0.0861 - val_mean_absolute_error: 0.1980 - val_mean_squared_error: 0.0861
Epoch 552/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0750 - mean_absolute_error: 0.1884 - mean_squared_error: 0.0750 - val_loss: 0.1075 - val_mean_absolute_error: 0.2278 - val_mean_squared_error: 0.1075
Epoch 553/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0860 - mean_absolute_error: 0.2029 - mean_squared_error: 0.0860 - val_loss: 0.1013 - val_mean_absolute_error: 0.2212 - val_mean_squared_error: 0.1013
Epoch 554/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0999 - mean_absolute_error: 0.2169 - mean_squared_error: 0.0999 - val_loss: 0.1117 - val_mean_absolute_error: 0.2304 - val_mean_squared_error: 0.1117
Epoch 555/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0885 - mean_absolute_error: 0.2095 - mean_squared_error: 0.0885 - val_loss: 0.1169 - val_mean_absolute_error: 0.2346 - val_mean_squared_error: 0.1169
Epoch 556/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0962 - mean_absolute_error: 0.2137 - mean_squared_error: 0.0962 - val_loss: 0.0769 - val_mean_absolute_error: 0.1890 - val_mean_squared_error: 0.0769
Epoch 557/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.0860 - mean_absolute_error: 0.2059 - mean_squared_error: 0.0860 - val_loss: 0.1201 - val_mean_absolute_error: 0.2379 - val_mean_squared_error: 0.1201
Epoch 558/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0841 - mean_absolute_error: 0.2025 - mean_squared_error: 0.0841 - val_loss: 0.0743 - val_mean_absolute_error: 0.1813 - val_mean_squared_error: 0.0743
Epoch 559/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0939 - mean_absolute_error: 0.2071 - mean_squared_error: 0.0939 - val_loss: 0.0827 - val_mean_absolute_error: 0.1953 - val_mean_squared_error: 0.0827
Epoch 560/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0779 - mean_absolute_error: 0.1959 - mean_squared_error: 0.0779 - val_loss: 0.1134 - val_mean_absolute_error: 0.2305 - val_mean_squared_error: 0.1134
Epoch 561/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1188 - mean_absolute_error: 0.2395 - mean_squared_error: 0.1188 - val_loss: 0.0787 - val_mean_absolute_error: 0.1848 - val_mean_squared_error: 0.0787
Epoch 562/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0728 - mean_absolute_error: 0.1841 - mean_squared_error: 0.0728 - val_loss: 0.0797 - val_mean_absolute_error: 0.1900 - val_mean_squared_error: 0.0797
Epoch 563/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1223 - mean_absolute_error: 0.2407 - mean_squared_error: 0.1223 - val_loss: 0.1217 - val_mean_absolute_error: 0.2432 - val_mean_squared_error: 0.1217
Epoch 564/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0908 - mean_absolute_error: 0.2096 - mean_squared_error: 0.0908 - val_loss: 0.0763 - val_mean_absolute_error: 0.1827 - val_mean_squared_error: 0.0763
Epoch 565/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0764 - mean_absolute_error: 0.1893 - mean_squared_error: 0.0764 - val_loss: 0.0877 - val_mean_absolute_error: 0.2064 - val_mean_squared_error: 0.0877
Epoch 566/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1306 - mean_absolute_error: 0.2482 - mean_squared_error: 0.1306 - val_loss: 0.1053 - val_mean_absolute_error: 0.2203 - val_mean_squared_error: 0.1053
Epoch 567/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0832 - mean_absolute_error: 0.1985 - mean_squared_error: 0.0832 - val_loss: 0.1135 - val_mean_absolute_error: 0.2298 - val_mean_squared_error: 0.1135
Epoch 568/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1011 - mean_absolute_error: 0.2234 - mean_squared_error: 0.1011 - val_loss: 0.0881 - val_mean_absolute_error: 0.2128 - val_mean_squared_error: 0.0881
Epoch 569/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0785 - mean_absolute_error: 0.1939 - mean_squared_error: 0.0785 - val_loss: 0.0752 - val_mean_absolute_error: 0.1800 - val_mean_squared_error: 0.0752
Epoch 570/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0840 - mean_absolute_error: 0.1976 - mean_squared_error: 0.0840 - val_loss: 0.0776 - val_mean_absolute_error: 0.1845 - val_mean_squared_error: 0.0776
Epoch 571/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0858 - mean_absolute_error: 0.2063 - mean_squared_error: 0.0858 - val_loss: 0.0853 - val_mean_absolute_error: 0.2006 - val_mean_squared_error: 0.0853
Epoch 572/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0856 - mean_absolute_error: 0.2033 - mean_squared_error: 0.0856 - val_loss: 0.0868 - val_mean_absolute_error: 0.2057 - val_mean_squared_error: 0.0868
Epoch 573/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0922 - mean_absolute_error: 0.2160 - mean_squared_error: 0.0922 - val_loss: 0.0859 - val_mean_absolute_error: 0.2022 - val_mean_squared_error: 0.0859
Epoch 574/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1048 - mean_absolute_error: 0.2329 - mean_squared_error: 0.1048 - val_loss: 0.0999 - val_mean_absolute_error: 0.2315 - val_mean_squared_error: 0.0999
Epoch 575/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0913 - mean_absolute_error: 0.2119 - mean_squared_error: 0.0913 - val_loss: 0.0968 - val_mean_absolute_error: 0.2216 - val_mean_squared_error: 0.0968
Epoch 576/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1171 - mean_absolute_error: 0.2384 - mean_squared_error: 0.1171 - val_loss: 0.1043 - val_mean_absolute_error: 0.2244 - val_mean_squared_error: 0.1043
Epoch 577/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0916 - mean_absolute_error: 0.2141 - mean_squared_error: 0.0916 - val_loss: 0.0889 - val_mean_absolute_error: 0.2155 - val_mean_squared_error: 0.0889
Epoch 578/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0916 - mean_absolute_error: 0.2164 - mean_squared_error: 0.0916 - val_loss: 0.0896 - val_mean_absolute_error: 0.2133 - val_mean_squared_error: 0.0896
Epoch 579/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0876 - mean_absolute_error: 0.2095 - mean_squared_error: 0.0876 - val_loss: 0.1300 - val_mean_absolute_error: 0.2397 - val_mean_squared_error: 0.1300
Epoch 580/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0907 - mean_absolute_error: 0.2031 - mean_squared_error: 0.0907 - val_loss: 0.0823 - val_mean_absolute_error: 0.1955 - val_mean_squared_error: 0.0823
Epoch 581/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0824 - mean_absolute_error: 0.1968 - mean_squared_error: 0.0824 - val_loss: 0.0717 - val_mean_absolute_error: 0.1763 - val_mean_squared_error: 0.0717
Epoch 582/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.0813 - mean_absolute_error: 0.1960 - mean_squared_error: 0.0813 - val_loss: 0.1137 - val_mean_absolute_error: 0.2262 - val_mean_squared_error: 0.1137
Epoch 583/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1040 - mean_absolute_error: 0.2162 - mean_squared_error: 0.1040 - val_loss: 0.0780 - val_mean_absolute_error: 0.1813 - val_mean_squared_error: 0.0780
Epoch 584/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0743 - mean_absolute_error: 0.1868 - mean_squared_error: 0.0743 - val_loss: 0.0888 - val_mean_absolute_error: 0.2073 - val_mean_squared_error: 0.0888
Epoch 585/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0718 - mean_absolute_error: 0.1885 - mean_squared_error: 0.0718 - val_loss: 0.0868 - val_mean_absolute_error: 0.1990 - val_mean_squared_error: 0.0868
Epoch 586/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1912 - mean_absolute_error: 0.2933 - mean_squared_error: 0.1912 - val_loss: 0.1460 - val_mean_absolute_error: 0.2519 - val_mean_squared_error: 0.1460
Epoch 587/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1029 - mean_absolute_error: 0.2204 - mean_squared_error: 0.1029 - val_loss: 0.0826 - val_mean_absolute_error: 0.1912 - val_mean_squared_error: 0.0826
Epoch 588/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0782 - mean_absolute_error: 0.1949 - mean_squared_error: 0.0782 - val_loss: 0.0811 - val_mean_absolute_error: 0.1970 - val_mean_squared_error: 0.0811
Epoch 589/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1044 - mean_absolute_error: 0.2388 - mean_squared_error: 0.1044 - val_loss: 0.0856 - val_mean_absolute_error: 0.2053 - val_mean_squared_error: 0.0856
Epoch 590/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0788 - mean_absolute_error: 0.1970 - mean_squared_error: 0.0788 - val_loss: 0.0815 - val_mean_absolute_error: 0.1908 - val_mean_squared_error: 0.0815
Epoch 591/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0845 - mean_absolute_error: 0.2078 - mean_squared_error: 0.0845 - val_loss: 0.0767 - val_mean_absolute_error: 0.1875 - val_mean_squared_error: 0.0767
Epoch 592/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0792 - mean_absolute_error: 0.1948 - mean_squared_error: 0.0792 - val_loss: 0.1422 - val_mean_absolute_error: 0.2668 - val_mean_squared_error: 0.1422
Epoch 593/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0928 - mean_absolute_error: 0.2092 - mean_squared_error: 0.0928 - val_loss: 0.0736 - val_mean_absolute_error: 0.1771 - val_mean_squared_error: 0.0736
Epoch 594/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0911 - mean_absolute_error: 0.2082 - mean_squared_error: 0.0911 - val_loss: 0.0823 - val_mean_absolute_error: 0.2003 - val_mean_squared_error: 0.0823
Epoch 595/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1039 - mean_absolute_error: 0.2369 - mean_squared_error: 0.1039 - val_loss: 0.1338 - val_mean_absolute_error: 0.2487 - val_mean_squared_error: 0.1338
Epoch 596/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0871 - mean_absolute_error: 0.2084 - mean_squared_error: 0.0871 - val_loss: 0.0882 - val_mean_absolute_error: 0.2037 - val_mean_squared_error: 0.0882
Epoch 597/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1292 - mean_absolute_error: 0.2536 - mean_squared_error: 0.1292 - val_loss: 0.0765 - val_mean_absolute_error: 0.1840 - val_mean_squared_error: 0.0765
Epoch 598/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0692 - mean_absolute_error: 0.1774 - mean_squared_error: 0.0692 - val_loss: 0.0818 - val_mean_absolute_error: 0.1903 - val_mean_squared_error: 0.0818
Epoch 599/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0898 - mean_absolute_error: 0.2074 - mean_squared_error: 0.0898 - val_loss: 0.1009 - val_mean_absolute_error: 0.2181 - val_mean_squared_error: 0.1009
Epoch 600/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0756 - mean_absolute_error: 0.1888 - mean_squared_error: 0.0756 - val_loss: 0.1051 - val_mean_absolute_error: 0.2193 - val_mean_squared_error: 0.1051
Epoch 601/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0874 - mean_absolute_error: 0.2003 - mean_squared_error: 0.0874 - val_loss: 0.0789 - val_mean_absolute_error: 0.1944 - val_mean_squared_error: 0.0789
Epoch 602/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0785 - mean_absolute_error: 0.1938 - mean_squared_error: 0.0785 - val_loss: 0.0776 - val_mean_absolute_error: 0.1881 - val_mean_squared_error: 0.0776
Epoch 603/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0757 - mean_absolute_error: 0.1887 - mean_squared_error: 0.0757 - val_loss: 0.1098 - val_mean_absolute_error: 0.2273 - val_mean_squared_error: 0.1098
Epoch 604/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0719 - mean_absolute_error: 0.1874 - mean_squared_error: 0.0719 - val_loss: 0.0781 - val_mean_absolute_error: 0.1939 - val_mean_squared_error: 0.0781
Epoch 605/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.3002 - mean_absolute_error: 0.3252 - mean_squared_error: 0.3002 - val_loss: 0.2160 - val_mean_absolute_error: 0.3602 - val_mean_squared_error: 0.2160
Epoch 606/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.2396 - mean_absolute_error: 0.3379 - mean_squared_error: 0.2396 - val_loss: 0.1961 - val_mean_absolute_error: 0.3172 - val_mean_squared_error: 0.1961
Epoch 607/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1434 - mean_absolute_error: 0.2475 - mean_squared_error: 0.1434 - val_loss: 0.1107 - val_mean_absolute_error: 0.2104 - val_mean_squared_error: 0.1107
Epoch 608/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0934 - mean_absolute_error: 0.2051 - mean_squared_error: 0.0934 - val_loss: 0.0935 - val_mean_absolute_error: 0.2069 - val_mean_squared_error: 0.0935
Epoch 609/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0879 - mean_absolute_error: 0.2077 - mean_squared_error: 0.0879 - val_loss: 0.1304 - val_mean_absolute_error: 0.2788 - val_mean_squared_error: 0.1304
Epoch 610/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1002 - mean_absolute_error: 0.2300 - mean_squared_error: 0.1002 - val_loss: 0.0921 - val_mean_absolute_error: 0.2053 - val_mean_squared_error: 0.0921
Epoch 611/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0870 - mean_absolute_error: 0.2109 - mean_squared_error: 0.0870 - val_loss: 0.0863 - val_mean_absolute_error: 0.1967 - val_mean_squared_error: 0.0863
Epoch 612/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0857 - mean_absolute_error: 0.2108 - mean_squared_error: 0.0857 - val_loss: 0.0939 - val_mean_absolute_error: 0.2100 - val_mean_squared_error: 0.0939
Epoch 613/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0787 - mean_absolute_error: 0.1965 - mean_squared_error: 0.0787 - val_loss: 0.1048 - val_mean_absolute_error: 0.2319 - val_mean_squared_error: 0.1048
Epoch 614/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0756 - mean_absolute_error: 0.1911 - mean_squared_error: 0.0756 - val_loss: 0.0775 - val_mean_absolute_error: 0.1848 - val_mean_squared_error: 0.0775
Epoch 615/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1008 - mean_absolute_error: 0.2241 - mean_squared_error: 0.1008 - val_loss: 0.1245 - val_mean_absolute_error: 0.2578 - val_mean_squared_error: 0.1245
Epoch 616/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0979 - mean_absolute_error: 0.2255 - mean_squared_error: 0.0979 - val_loss: 0.1045 - val_mean_absolute_error: 0.2330 - val_mean_squared_error: 0.1045
Epoch 617/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0737 - mean_absolute_error: 0.1911 - mean_squared_error: 0.0737 - val_loss: 0.0748 - val_mean_absolute_error: 0.1885 - val_mean_squared_error: 0.0748
Epoch 618/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0704 - mean_absolute_error: 0.1889 - mean_squared_error: 0.0704 - val_loss: 0.0878 - val_mean_absolute_error: 0.2147 - val_mean_squared_error: 0.0878
Epoch 619/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0798 - mean_absolute_error: 0.1969 - mean_squared_error: 0.0798 - val_loss: 0.1653 - val_mean_absolute_error: 0.2931 - val_mean_squared_error: 0.1653
Epoch 620/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1016 - mean_absolute_error: 0.2198 - mean_squared_error: 0.1016 - val_loss: 0.1017 - val_mean_absolute_error: 0.2125 - val_mean_squared_error: 0.1017
Epoch 621/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0736 - mean_absolute_error: 0.1847 - mean_squared_error: 0.0736 - val_loss: 0.0703 - val_mean_absolute_error: 0.1750 - val_mean_squared_error: 0.0703
Epoch 622/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.0668 - mean_absolute_error: 0.1785 - mean_squared_error: 0.0668 - val_loss: 0.0799 - val_mean_absolute_error: 0.1879 - val_mean_squared_error: 0.0799
Epoch 623/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1290 - mean_absolute_error: 0.2551 - mean_squared_error: 0.1290 - val_loss: 0.1118 - val_mean_absolute_error: 0.2438 - val_mean_squared_error: 0.1118
Epoch 624/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0858 - mean_absolute_error: 0.2072 - mean_squared_error: 0.0858 - val_loss: 0.0998 - val_mean_absolute_error: 0.2090 - val_mean_squared_error: 0.0998
Epoch 625/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0842 - mean_absolute_error: 0.1995 - mean_squared_error: 0.0842 - val_loss: 0.0713 - val_mean_absolute_error: 0.1825 - val_mean_squared_error: 0.0713
Epoch 626/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0864 - mean_absolute_error: 0.2071 - mean_squared_error: 0.0864 - val_loss: 0.1120 - val_mean_absolute_error: 0.2231 - val_mean_squared_error: 0.1120
Epoch 627/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0984 - mean_absolute_error: 0.2110 - mean_squared_error: 0.0984 - val_loss: 0.0923 - val_mean_absolute_error: 0.2070 - val_mean_squared_error: 0.0923
Epoch 628/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0822 - mean_absolute_error: 0.1963 - mean_squared_error: 0.0822 - val_loss: 0.0918 - val_mean_absolute_error: 0.2081 - val_mean_squared_error: 0.0918
Epoch 629/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0972 - mean_absolute_error: 0.2162 - mean_squared_error: 0.0972 - val_loss: 0.0720 - val_mean_absolute_error: 0.1786 - val_mean_squared_error: 0.0720
Epoch 630/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0717 - mean_absolute_error: 0.1832 - mean_squared_error: 0.0717 - val_loss: 0.0734 - val_mean_absolute_error: 0.1816 - val_mean_squared_error: 0.0734
Epoch 631/1000
8000/8000 [==============================] - 1s 134us/step - loss: 0.0773 - mean_absolute_error: 0.1921 - mean_squared_error: 0.0773 - val_loss: 0.0785 - val_mean_absolute_error: 0.1892 - val_mean_squared_error: 0.0785
Epoch 632/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0757 - mean_absolute_error: 0.1919 - mean_squared_error: 0.0757 - val_loss: 0.1074 - val_mean_absolute_error: 0.2261 - val_mean_squared_error: 0.1074
Epoch 633/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0912 - mean_absolute_error: 0.2092 - mean_squared_error: 0.0912 - val_loss: 0.0741 - val_mean_absolute_error: 0.1858 - val_mean_squared_error: 0.0741
Epoch 634/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1017 - mean_absolute_error: 0.2223 - mean_squared_error: 0.1017 - val_loss: 0.0736 - val_mean_absolute_error: 0.1775 - val_mean_squared_error: 0.0736
Epoch 635/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0797 - mean_absolute_error: 0.1970 - mean_squared_error: 0.0797 - val_loss: 0.0704 - val_mean_absolute_error: 0.1770 - val_mean_squared_error: 0.0704
Epoch 636/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0737 - mean_absolute_error: 0.1920 - mean_squared_error: 0.0737 - val_loss: 0.0890 - val_mean_absolute_error: 0.2107 - val_mean_squared_error: 0.0890
Epoch 637/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1227 - mean_absolute_error: 0.2415 - mean_squared_error: 0.1227 - val_loss: 0.0838 - val_mean_absolute_error: 0.1889 - val_mean_squared_error: 0.0838
Epoch 638/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0779 - mean_absolute_error: 0.1935 - mean_squared_error: 0.0779 - val_loss: 0.0764 - val_mean_absolute_error: 0.1799 - val_mean_squared_error: 0.0764
Epoch 639/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0745 - mean_absolute_error: 0.1899 - mean_squared_error: 0.0745 - val_loss: 0.1252 - val_mean_absolute_error: 0.2566 - val_mean_squared_error: 0.1252
Epoch 640/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1006 - mean_absolute_error: 0.2258 - mean_squared_error: 0.1006 - val_loss: 0.0751 - val_mean_absolute_error: 0.1885 - val_mean_squared_error: 0.0751
Epoch 641/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0880 - mean_absolute_error: 0.2076 - mean_squared_error: 0.0880 - val_loss: 0.0830 - val_mean_absolute_error: 0.1895 - val_mean_squared_error: 0.0830
Epoch 642/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0771 - mean_absolute_error: 0.1951 - mean_squared_error: 0.0771 - val_loss: 0.0957 - val_mean_absolute_error: 0.2160 - val_mean_squared_error: 0.0957
Epoch 643/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1074 - mean_absolute_error: 0.2324 - mean_squared_error: 0.1074 - val_loss: 0.1452 - val_mean_absolute_error: 0.2645 - val_mean_squared_error: 0.1452
Epoch 644/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1009 - mean_absolute_error: 0.2419 - mean_squared_error: 0.1009 - val_loss: 0.0969 - val_mean_absolute_error: 0.2132 - val_mean_squared_error: 0.0969
Epoch 645/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0921 - mean_absolute_error: 0.2197 - mean_squared_error: 0.0921 - val_loss: 0.0747 - val_mean_absolute_error: 0.1900 - val_mean_squared_error: 0.0747
Epoch 646/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0724 - mean_absolute_error: 0.1883 - mean_squared_error: 0.0724 - val_loss: 0.0732 - val_mean_absolute_error: 0.1827 - val_mean_squared_error: 0.0732
Epoch 647/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0971 - mean_absolute_error: 0.2172 - mean_squared_error: 0.0971 - val_loss: 0.1007 - val_mean_absolute_error: 0.2075 - val_mean_squared_error: 0.1007
Epoch 648/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0774 - mean_absolute_error: 0.1954 - mean_squared_error: 0.0774 - val_loss: 0.0802 - val_mean_absolute_error: 0.1944 - val_mean_squared_error: 0.0802
Epoch 649/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0667 - mean_absolute_error: 0.1801 - mean_squared_error: 0.0667 - val_loss: 0.0872 - val_mean_absolute_error: 0.2014 - val_mean_squared_error: 0.0872
Epoch 650/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1192 - mean_absolute_error: 0.2344 - mean_squared_error: 0.1192 - val_loss: 0.0694 - val_mean_absolute_error: 0.1716 - val_mean_squared_error: 0.0694
Epoch 651/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.0682 - mean_absolute_error: 0.1778 - mean_squared_error: 0.0682 - val_loss: 0.0705 - val_mean_absolute_error: 0.1800 - val_mean_squared_error: 0.0705
Epoch 652/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0758 - mean_absolute_error: 0.1900 - mean_squared_error: 0.0758 - val_loss: 0.1158 - val_mean_absolute_error: 0.2461 - val_mean_squared_error: 0.1158
Epoch 653/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0926 - mean_absolute_error: 0.2149 - mean_squared_error: 0.0926 - val_loss: 0.0708 - val_mean_absolute_error: 0.1795 - val_mean_squared_error: 0.0708
Epoch 654/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0715 - mean_absolute_error: 0.1857 - mean_squared_error: 0.0715 - val_loss: 0.0773 - val_mean_absolute_error: 0.1909 - val_mean_squared_error: 0.0773
Epoch 655/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0778 - mean_absolute_error: 0.1949 - mean_squared_error: 0.0778 - val_loss: 0.0831 - val_mean_absolute_error: 0.1908 - val_mean_squared_error: 0.0831
Epoch 656/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0755 - mean_absolute_error: 0.1920 - mean_squared_error: 0.0755 - val_loss: 0.0860 - val_mean_absolute_error: 0.1962 - val_mean_squared_error: 0.0860
Epoch 657/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0890 - mean_absolute_error: 0.2027 - mean_squared_error: 0.0890 - val_loss: 0.0720 - val_mean_absolute_error: 0.1813 - val_mean_squared_error: 0.0720
Epoch 658/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0837 - mean_absolute_error: 0.1989 - mean_squared_error: 0.0837 - val_loss: 0.0847 - val_mean_absolute_error: 0.2077 - val_mean_squared_error: 0.0847
Epoch 659/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0787 - mean_absolute_error: 0.1958 - mean_squared_error: 0.0787 - val_loss: 0.0944 - val_mean_absolute_error: 0.2099 - val_mean_squared_error: 0.0944
Epoch 660/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0847 - mean_absolute_error: 0.2043 - mean_squared_error: 0.0847 - val_loss: 0.1535 - val_mean_absolute_error: 0.2907 - val_mean_squared_error: 0.1535
Epoch 661/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1046 - mean_absolute_error: 0.2356 - mean_squared_error: 0.1046 - val_loss: 0.1258 - val_mean_absolute_error: 0.2391 - val_mean_squared_error: 0.1258
Epoch 662/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0894 - mean_absolute_error: 0.2094 - mean_squared_error: 0.0894 - val_loss: 0.0887 - val_mean_absolute_error: 0.2016 - val_mean_squared_error: 0.0887
Epoch 663/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0799 - mean_absolute_error: 0.2014 - mean_squared_error: 0.0799 - val_loss: 0.0842 - val_mean_absolute_error: 0.2046 - val_mean_squared_error: 0.0842
Epoch 664/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0806 - mean_absolute_error: 0.1960 - mean_squared_error: 0.0806 - val_loss: 0.0675 - val_mean_absolute_error: 0.1723 - val_mean_squared_error: 0.0675
Epoch 665/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0653 - mean_absolute_error: 0.1749 - mean_squared_error: 0.0653 - val_loss: 0.0659 - val_mean_absolute_error: 0.1689 - val_mean_squared_error: 0.0659
Epoch 666/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.1387 - mean_absolute_error: 0.2389 - mean_squared_error: 0.1387 - val_loss: 0.2115 - val_mean_absolute_error: 0.3240 - val_mean_squared_error: 0.2115
Epoch 667/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1095 - mean_absolute_error: 0.2304 - mean_squared_error: 0.1095 - val_loss: 0.0794 - val_mean_absolute_error: 0.1791 - val_mean_squared_error: 0.0794
Epoch 668/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0756 - mean_absolute_error: 0.1898 - mean_squared_error: 0.0756 - val_loss: 0.0779 - val_mean_absolute_error: 0.1949 - val_mean_squared_error: 0.0779
Epoch 669/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0655 - mean_absolute_error: 0.1763 - mean_squared_error: 0.0655 - val_loss: 0.0750 - val_mean_absolute_error: 0.1831 - val_mean_squared_error: 0.0750
Epoch 670/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0784 - mean_absolute_error: 0.1941 - mean_squared_error: 0.0784 - val_loss: 0.0728 - val_mean_absolute_error: 0.1845 - val_mean_squared_error: 0.0728
Epoch 671/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0667 - mean_absolute_error: 0.1795 - mean_squared_error: 0.0667 - val_loss: 0.0687 - val_mean_absolute_error: 0.1752 - val_mean_squared_error: 0.0687
Epoch 672/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0724 - mean_absolute_error: 0.1852 - mean_squared_error: 0.0724 - val_loss: 0.0835 - val_mean_absolute_error: 0.2021 - val_mean_squared_error: 0.0835
Epoch 673/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1503 - mean_absolute_error: 0.2625 - mean_squared_error: 0.1503 - val_loss: 0.1177 - val_mean_absolute_error: 0.2230 - val_mean_squared_error: 0.1177
Epoch 674/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1097 - mean_absolute_error: 0.2266 - mean_squared_error: 0.1097 - val_loss: 0.1405 - val_mean_absolute_error: 0.2745 - val_mean_squared_error: 0.1405
Epoch 675/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0905 - mean_absolute_error: 0.2157 - mean_squared_error: 0.0905 - val_loss: 0.0825 - val_mean_absolute_error: 0.2073 - val_mean_squared_error: 0.0825
Epoch 676/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0730 - mean_absolute_error: 0.1905 - mean_squared_error: 0.0730 - val_loss: 0.0763 - val_mean_absolute_error: 0.1925 - val_mean_squared_error: 0.0763
Epoch 677/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0658 - mean_absolute_error: 0.1757 - mean_squared_error: 0.0658 - val_loss: 0.0677 - val_mean_absolute_error: 0.1742 - val_mean_squared_error: 0.0677
Epoch 678/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0886 - mean_absolute_error: 0.2047 - mean_squared_error: 0.0886 - val_loss: 0.1944 - val_mean_absolute_error: 0.3544 - val_mean_squared_error: 0.1944
Epoch 679/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1088 - mean_absolute_error: 0.2386 - mean_squared_error: 0.1088 - val_loss: 0.1072 - val_mean_absolute_error: 0.2413 - val_mean_squared_error: 0.1072
Epoch 680/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0807 - mean_absolute_error: 0.2049 - mean_squared_error: 0.0807 - val_loss: 0.0860 - val_mean_absolute_error: 0.2011 - val_mean_squared_error: 0.0860
Epoch 681/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0917 - mean_absolute_error: 0.2120 - mean_squared_error: 0.0917 - val_loss: 0.0841 - val_mean_absolute_error: 0.2064 - val_mean_squared_error: 0.0841
Epoch 682/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1005 - mean_absolute_error: 0.2233 - mean_squared_error: 0.1005 - val_loss: 0.0717 - val_mean_absolute_error: 0.1833 - val_mean_squared_error: 0.0717
Epoch 683/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.0733 - mean_absolute_error: 0.1915 - mean_squared_error: 0.0733 - val_loss: 0.0795 - val_mean_absolute_error: 0.2022 - val_mean_squared_error: 0.0795
Epoch 684/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.1057 - mean_absolute_error: 0.2365 - mean_squared_error: 0.1057 - val_loss: 0.0790 - val_mean_absolute_error: 0.1955 - val_mean_squared_error: 0.0790
Epoch 685/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0758 - mean_absolute_error: 0.1964 - mean_squared_error: 0.0758 - val_loss: 0.0885 - val_mean_absolute_error: 0.2196 - val_mean_squared_error: 0.0885
Epoch 686/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0755 - mean_absolute_error: 0.1941 - mean_squared_error: 0.0755 - val_loss: 0.0691 - val_mean_absolute_error: 0.1769 - val_mean_squared_error: 0.0691
Epoch 687/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0870 - mean_absolute_error: 0.2009 - mean_squared_error: 0.0870 - val_loss: 0.0737 - val_mean_absolute_error: 0.1803 - val_mean_squared_error: 0.0737
Epoch 688/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0712 - mean_absolute_error: 0.1849 - mean_squared_error: 0.0712 - val_loss: 0.0934 - val_mean_absolute_error: 0.2213 - val_mean_squared_error: 0.0934
Epoch 689/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0712 - mean_absolute_error: 0.1842 - mean_squared_error: 0.0712 - val_loss: 0.0664 - val_mean_absolute_error: 0.1731 - val_mean_squared_error: 0.0664
Epoch 690/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0620 - mean_absolute_error: 0.1715 - mean_squared_error: 0.0620 - val_loss: 0.0717 - val_mean_absolute_error: 0.1805 - val_mean_squared_error: 0.0717
Epoch 691/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1952 - mean_absolute_error: 0.2963 - mean_squared_error: 0.1952 - val_loss: 0.1902 - val_mean_absolute_error: 0.3053 - val_mean_squared_error: 0.1902
Epoch 692/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0952 - mean_absolute_error: 0.2083 - mean_squared_error: 0.0952 - val_loss: 0.0819 - val_mean_absolute_error: 0.1932 - val_mean_squared_error: 0.0819
Epoch 693/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0733 - mean_absolute_error: 0.1867 - mean_squared_error: 0.0733 - val_loss: 0.0726 - val_mean_absolute_error: 0.1845 - val_mean_squared_error: 0.0726
Epoch 694/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0764 - mean_absolute_error: 0.1927 - mean_squared_error: 0.0764 - val_loss: 0.0765 - val_mean_absolute_error: 0.1942 - val_mean_squared_error: 0.0765
Epoch 695/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0730 - mean_absolute_error: 0.1927 - mean_squared_error: 0.0730 - val_loss: 0.0691 - val_mean_absolute_error: 0.1826 - val_mean_squared_error: 0.0691
Epoch 696/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0722 - mean_absolute_error: 0.1869 - mean_squared_error: 0.0722 - val_loss: 0.0726 - val_mean_absolute_error: 0.1835 - val_mean_squared_error: 0.0726
Epoch 697/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0657 - mean_absolute_error: 0.1811 - mean_squared_error: 0.0657 - val_loss: 0.1169 - val_mean_absolute_error: 0.2523 - val_mean_squared_error: 0.1169
Epoch 698/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1133 - mean_absolute_error: 0.2371 - mean_squared_error: 0.1133 - val_loss: 0.1065 - val_mean_absolute_error: 0.2368 - val_mean_squared_error: 0.1065
Epoch 699/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0750 - mean_absolute_error: 0.1951 - mean_squared_error: 0.0750 - val_loss: 0.0725 - val_mean_absolute_error: 0.1831 - val_mean_squared_error: 0.0725
Epoch 700/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0819 - mean_absolute_error: 0.2045 - mean_squared_error: 0.0819 - val_loss: 0.0684 - val_mean_absolute_error: 0.1787 - val_mean_squared_error: 0.0684
Epoch 701/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0847 - mean_absolute_error: 0.2022 - mean_squared_error: 0.0847 - val_loss: 0.0792 - val_mean_absolute_error: 0.1895 - val_mean_squared_error: 0.0792
Epoch 702/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0802 - mean_absolute_error: 0.2012 - mean_squared_error: 0.0802 - val_loss: 0.0711 - val_mean_absolute_error: 0.1828 - val_mean_squared_error: 0.0711
Epoch 703/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0678 - mean_absolute_error: 0.1798 - mean_squared_error: 0.0678 - val_loss: 0.0715 - val_mean_absolute_error: 0.1774 - val_mean_squared_error: 0.0715
Epoch 704/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0719 - mean_absolute_error: 0.1831 - mean_squared_error: 0.0719 - val_loss: 0.0740 - val_mean_absolute_error: 0.1816 - val_mean_squared_error: 0.0740
Epoch 705/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0793 - mean_absolute_error: 0.1904 - mean_squared_error: 0.0793 - val_loss: 0.0736 - val_mean_absolute_error: 0.1843 - val_mean_squared_error: 0.0736
Epoch 706/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0605 - mean_absolute_error: 0.1687 - mean_squared_error: 0.0605 - val_loss: 0.0914 - val_mean_absolute_error: 0.2007 - val_mean_squared_error: 0.0914
Epoch 707/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0894 - mean_absolute_error: 0.2039 - mean_squared_error: 0.0894 - val_loss: 0.0707 - val_mean_absolute_error: 0.1749 - val_mean_squared_error: 0.0707
Epoch 708/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0829 - mean_absolute_error: 0.1999 - mean_squared_error: 0.0829 - val_loss: 0.0997 - val_mean_absolute_error: 0.2281 - val_mean_squared_error: 0.0997
Epoch 709/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1185 - mean_absolute_error: 0.2525 - mean_squared_error: 0.1185 - val_loss: 0.0854 - val_mean_absolute_error: 0.2062 - val_mean_squared_error: 0.0854
Epoch 710/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0722 - mean_absolute_error: 0.1948 - mean_squared_error: 0.0722 - val_loss: 0.1277 - val_mean_absolute_error: 0.2658 - val_mean_squared_error: 0.1277
Epoch 711/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0917 - mean_absolute_error: 0.2187 - mean_squared_error: 0.0917 - val_loss: 0.0830 - val_mean_absolute_error: 0.2037 - val_mean_squared_error: 0.0830
Epoch 712/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0705 - mean_absolute_error: 0.1843 - mean_squared_error: 0.0705 - val_loss: 0.0823 - val_mean_absolute_error: 0.1996 - val_mean_squared_error: 0.0823
Epoch 713/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0983 - mean_absolute_error: 0.2177 - mean_squared_error: 0.0983 - val_loss: 0.0706 - val_mean_absolute_error: 0.1860 - val_mean_squared_error: 0.0706
Epoch 714/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0608 - mean_absolute_error: 0.1727 - mean_squared_error: 0.0608 - val_loss: 0.0967 - val_mean_absolute_error: 0.2146 - val_mean_squared_error: 0.0967
Epoch 715/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0883 - mean_absolute_error: 0.2113 - mean_squared_error: 0.0883 - val_loss: 0.0913 - val_mean_absolute_error: 0.2037 - val_mean_squared_error: 0.0913
Epoch 716/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0700 - mean_absolute_error: 0.1812 - mean_squared_error: 0.0700 - val_loss: 0.0796 - val_mean_absolute_error: 0.1884 - val_mean_squared_error: 0.0796
Epoch 717/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0747 - mean_absolute_error: 0.1884 - mean_squared_error: 0.0747 - val_loss: 0.0926 - val_mean_absolute_error: 0.2090 - val_mean_squared_error: 0.0926
Epoch 718/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0607 - mean_absolute_error: 0.1673 - mean_squared_error: 0.0607 - val_loss: 0.0855 - val_mean_absolute_error: 0.1959 - val_mean_squared_error: 0.0855
Epoch 719/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0683 - mean_absolute_error: 0.1788 - mean_squared_error: 0.0683 - val_loss: 0.0760 - val_mean_absolute_error: 0.1884 - val_mean_squared_error: 0.0760
Epoch 720/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1350 - mean_absolute_error: 0.2426 - mean_squared_error: 0.1350 - val_loss: 0.1124 - val_mean_absolute_error: 0.2277 - val_mean_squared_error: 0.1124
Epoch 721/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0998 - mean_absolute_error: 0.2176 - mean_squared_error: 0.0998 - val_loss: 0.1000 - val_mean_absolute_error: 0.2178 - val_mean_squared_error: 0.1000
Epoch 722/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0708 - mean_absolute_error: 0.1874 - mean_squared_error: 0.0708 - val_loss: 0.0685 - val_mean_absolute_error: 0.1776 - val_mean_squared_error: 0.0685
Epoch 723/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.0717 - mean_absolute_error: 0.1918 - mean_squared_error: 0.0717 - val_loss: 0.1001 - val_mean_absolute_error: 0.2518 - val_mean_squared_error: 0.1001
Epoch 724/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0869 - mean_absolute_error: 0.2151 - mean_squared_error: 0.0869 - val_loss: 0.0828 - val_mean_absolute_error: 0.1985 - val_mean_squared_error: 0.0828
Epoch 725/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0716 - mean_absolute_error: 0.1918 - mean_squared_error: 0.0716 - val_loss: 0.0689 - val_mean_absolute_error: 0.1773 - val_mean_squared_error: 0.0689
Epoch 726/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0762 - mean_absolute_error: 0.1956 - mean_squared_error: 0.0762 - val_loss: 0.0685 - val_mean_absolute_error: 0.1746 - val_mean_squared_error: 0.0685
Epoch 727/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0721 - mean_absolute_error: 0.1838 - mean_squared_error: 0.0721 - val_loss: 0.0737 - val_mean_absolute_error: 0.1868 - val_mean_squared_error: 0.0737
Epoch 728/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1021 - mean_absolute_error: 0.2136 - mean_squared_error: 0.1021 - val_loss: 0.1445 - val_mean_absolute_error: 0.2550 - val_mean_squared_error: 0.1445
Epoch 729/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0845 - mean_absolute_error: 0.2003 - mean_squared_error: 0.0845 - val_loss: 0.0691 - val_mean_absolute_error: 0.1756 - val_mean_squared_error: 0.0691
Epoch 730/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0698 - mean_absolute_error: 0.1866 - mean_squared_error: 0.0698 - val_loss: 0.0945 - val_mean_absolute_error: 0.2324 - val_mean_squared_error: 0.0945
Epoch 731/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0747 - mean_absolute_error: 0.1995 - mean_squared_error: 0.0747 - val_loss: 0.0655 - val_mean_absolute_error: 0.1693 - val_mean_squared_error: 0.0655
Epoch 732/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.0811 - mean_absolute_error: 0.1989 - mean_squared_error: 0.0811 - val_loss: 0.1043 - val_mean_absolute_error: 0.2237 - val_mean_squared_error: 0.1043
Epoch 733/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0702 - mean_absolute_error: 0.1885 - mean_squared_error: 0.0702 - val_loss: 0.0717 - val_mean_absolute_error: 0.1871 - val_mean_squared_error: 0.0717
Epoch 734/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0679 - mean_absolute_error: 0.1806 - mean_squared_error: 0.0679 - val_loss: 0.0685 - val_mean_absolute_error: 0.1827 - val_mean_squared_error: 0.0685
Epoch 735/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0801 - mean_absolute_error: 0.1946 - mean_squared_error: 0.0801 - val_loss: 0.2424 - val_mean_absolute_error: 0.3502 - val_mean_squared_error: 0.2424
Epoch 736/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1103 - mean_absolute_error: 0.2281 - mean_squared_error: 0.1103 - val_loss: 0.0893 - val_mean_absolute_error: 0.2162 - val_mean_squared_error: 0.0893
Epoch 737/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0647 - mean_absolute_error: 0.1761 - mean_squared_error: 0.0647 - val_loss: 0.0657 - val_mean_absolute_error: 0.1727 - val_mean_squared_error: 0.0657
Epoch 738/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0706 - mean_absolute_error: 0.1861 - mean_squared_error: 0.0706 - val_loss: 0.1099 - val_mean_absolute_error: 0.2482 - val_mean_squared_error: 0.1099
Epoch 739/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0779 - mean_absolute_error: 0.2029 - mean_squared_error: 0.0779 - val_loss: 0.0777 - val_mean_absolute_error: 0.1924 - val_mean_squared_error: 0.0777
Epoch 740/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0693 - mean_absolute_error: 0.1893 - mean_squared_error: 0.0693 - val_loss: 0.0783 - val_mean_absolute_error: 0.2001 - val_mean_squared_error: 0.0783
Epoch 741/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0807 - mean_absolute_error: 0.2088 - mean_squared_error: 0.0807 - val_loss: 0.0623 - val_mean_absolute_error: 0.1650 - val_mean_squared_error: 0.0623
Epoch 742/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.1221 - mean_absolute_error: 0.2181 - mean_squared_error: 0.1221 - val_loss: 0.5597 - val_mean_absolute_error: 0.5117 - val_mean_squared_error: 0.5597
Epoch 743/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1502 - mean_absolute_error: 0.2492 - mean_squared_error: 0.1502 - val_loss: 0.0902 - val_mean_absolute_error: 0.1933 - val_mean_squared_error: 0.0902
Epoch 744/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0750 - mean_absolute_error: 0.1843 - mean_squared_error: 0.0750 - val_loss: 0.0707 - val_mean_absolute_error: 0.1716 - val_mean_squared_error: 0.0707
Epoch 745/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0695 - mean_absolute_error: 0.1874 - mean_squared_error: 0.0695 - val_loss: 0.0727 - val_mean_absolute_error: 0.1891 - val_mean_squared_error: 0.0727
Epoch 746/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0766 - mean_absolute_error: 0.1987 - mean_squared_error: 0.0766 - val_loss: 0.0951 - val_mean_absolute_error: 0.2196 - val_mean_squared_error: 0.0951
Epoch 747/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0718 - mean_absolute_error: 0.1918 - mean_squared_error: 0.0718 - val_loss: 0.0826 - val_mean_absolute_error: 0.2126 - val_mean_squared_error: 0.0826
Epoch 748/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0792 - mean_absolute_error: 0.2013 - mean_squared_error: 0.0792 - val_loss: 0.0654 - val_mean_absolute_error: 0.1688 - val_mean_squared_error: 0.0654
Epoch 749/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0694 - mean_absolute_error: 0.1784 - mean_squared_error: 0.0694 - val_loss: 0.1178 - val_mean_absolute_error: 0.2375 - val_mean_squared_error: 0.1178
Epoch 750/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0725 - mean_absolute_error: 0.1909 - mean_squared_error: 0.0725 - val_loss: 0.0722 - val_mean_absolute_error: 0.1886 - val_mean_squared_error: 0.0722
Epoch 751/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0644 - mean_absolute_error: 0.1783 - mean_squared_error: 0.0644 - val_loss: 0.0779 - val_mean_absolute_error: 0.1859 - val_mean_squared_error: 0.0779
Epoch 752/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0711 - mean_absolute_error: 0.1871 - mean_squared_error: 0.0711 - val_loss: 0.2015 - val_mean_absolute_error: 0.3218 - val_mean_squared_error: 0.2015
Epoch 753/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1109 - mean_absolute_error: 0.2238 - mean_squared_error: 0.1109 - val_loss: 0.0644 - val_mean_absolute_error: 0.1642 - val_mean_squared_error: 0.0644
Epoch 754/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0639 - mean_absolute_error: 0.1706 - mean_squared_error: 0.0639 - val_loss: 0.0821 - val_mean_absolute_error: 0.1904 - val_mean_squared_error: 0.0821
Epoch 755/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0728 - mean_absolute_error: 0.1905 - mean_squared_error: 0.0728 - val_loss: 0.0654 - val_mean_absolute_error: 0.1752 - val_mean_squared_error: 0.0654
Epoch 756/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0681 - mean_absolute_error: 0.1843 - mean_squared_error: 0.0681 - val_loss: 0.1095 - val_mean_absolute_error: 0.2371 - val_mean_squared_error: 0.1095
Epoch 757/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0648 - mean_absolute_error: 0.1782 - mean_squared_error: 0.0648 - val_loss: 0.0675 - val_mean_absolute_error: 0.1796 - val_mean_squared_error: 0.0675
Epoch 758/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0699 - mean_absolute_error: 0.1821 - mean_squared_error: 0.0699 - val_loss: 0.0963 - val_mean_absolute_error: 0.2067 - val_mean_squared_error: 0.0963
Epoch 759/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0933 - mean_absolute_error: 0.2112 - mean_squared_error: 0.0933 - val_loss: 0.0887 - val_mean_absolute_error: 0.2090 - val_mean_squared_error: 0.0887
Epoch 760/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1028 - mean_absolute_error: 0.2246 - mean_squared_error: 0.1028 - val_loss: 0.0848 - val_mean_absolute_error: 0.2031 - val_mean_squared_error: 0.0848
Epoch 761/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0827 - mean_absolute_error: 0.1985 - mean_squared_error: 0.0827 - val_loss: 0.0776 - val_mean_absolute_error: 0.1928 - val_mean_squared_error: 0.0776
Epoch 762/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0685 - mean_absolute_error: 0.1847 - mean_squared_error: 0.0685 - val_loss: 0.1286 - val_mean_absolute_error: 0.2688 - val_mean_squared_error: 0.1286
Epoch 763/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0713 - mean_absolute_error: 0.1890 - mean_squared_error: 0.0713 - val_loss: 0.0626 - val_mean_absolute_error: 0.1657 - val_mean_squared_error: 0.0626
Epoch 764/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0616 - mean_absolute_error: 0.1709 - mean_squared_error: 0.0616 - val_loss: 0.0606 - val_mean_absolute_error: 0.1663 - val_mean_squared_error: 0.0606
Epoch 765/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.0584 - mean_absolute_error: 0.1675 - mean_squared_error: 0.0584 - val_loss: 0.0697 - val_mean_absolute_error: 0.1784 - val_mean_squared_error: 0.0697
Epoch 766/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1802 - mean_absolute_error: 0.2731 - mean_squared_error: 0.1802 - val_loss: 0.1153 - val_mean_absolute_error: 0.2311 - val_mean_squared_error: 0.1153
Epoch 767/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1306 - mean_absolute_error: 0.2402 - mean_squared_error: 0.1306 - val_loss: 0.0943 - val_mean_absolute_error: 0.2001 - val_mean_squared_error: 0.0943
Epoch 768/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0836 - mean_absolute_error: 0.1985 - mean_squared_error: 0.0836 - val_loss: 0.0717 - val_mean_absolute_error: 0.1753 - val_mean_squared_error: 0.0717
Epoch 769/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0674 - mean_absolute_error: 0.1802 - mean_squared_error: 0.0674 - val_loss: 0.0826 - val_mean_absolute_error: 0.2073 - val_mean_squared_error: 0.0826
Epoch 770/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0706 - mean_absolute_error: 0.1924 - mean_squared_error: 0.0706 - val_loss: 0.0759 - val_mean_absolute_error: 0.2007 - val_mean_squared_error: 0.0759
Epoch 771/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0705 - mean_absolute_error: 0.1924 - mean_squared_error: 0.0705 - val_loss: 0.1110 - val_mean_absolute_error: 0.2598 - val_mean_squared_error: 0.1110
Epoch 772/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0837 - mean_absolute_error: 0.2121 - mean_squared_error: 0.0837 - val_loss: 0.0911 - val_mean_absolute_error: 0.2292 - val_mean_squared_error: 0.0911
Epoch 773/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0656 - mean_absolute_error: 0.1853 - mean_squared_error: 0.0656 - val_loss: 0.0686 - val_mean_absolute_error: 0.1764 - val_mean_squared_error: 0.0686
Epoch 774/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0701 - mean_absolute_error: 0.1862 - mean_squared_error: 0.0701 - val_loss: 0.0711 - val_mean_absolute_error: 0.1779 - val_mean_squared_error: 0.0711
Epoch 775/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0725 - mean_absolute_error: 0.1905 - mean_squared_error: 0.0725 - val_loss: 0.1127 - val_mean_absolute_error: 0.2385 - val_mean_squared_error: 0.1127
Epoch 776/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0792 - mean_absolute_error: 0.1980 - mean_squared_error: 0.0792 - val_loss: 0.0774 - val_mean_absolute_error: 0.1841 - val_mean_squared_error: 0.0774
Epoch 777/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0638 - mean_absolute_error: 0.1746 - mean_squared_error: 0.0638 - val_loss: 0.0909 - val_mean_absolute_error: 0.2119 - val_mean_squared_error: 0.0909
Epoch 778/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0720 - mean_absolute_error: 0.1843 - mean_squared_error: 0.0720 - val_loss: 0.0663 - val_mean_absolute_error: 0.1709 - val_mean_squared_error: 0.0663
Epoch 779/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0983 - mean_absolute_error: 0.2245 - mean_squared_error: 0.0983 - val_loss: 0.0844 - val_mean_absolute_error: 0.2185 - val_mean_squared_error: 0.0844
Epoch 780/1000
8000/8000 [==============================] - 1s 134us/step - loss: 0.0743 - mean_absolute_error: 0.2007 - mean_squared_error: 0.0743 - val_loss: 0.0676 - val_mean_absolute_error: 0.1843 - val_mean_squared_error: 0.0676
Epoch 781/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0610 - mean_absolute_error: 0.1780 - mean_squared_error: 0.0610 - val_loss: 0.0769 - val_mean_absolute_error: 0.2063 - val_mean_squared_error: 0.0769
Epoch 782/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.1660 - mean_absolute_error: 0.2774 - mean_squared_error: 0.1660 - val_loss: 0.1194 - val_mean_absolute_error: 0.2409 - val_mean_squared_error: 0.1194
Epoch 783/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0743 - mean_absolute_error: 0.1877 - mean_squared_error: 0.0743 - val_loss: 0.0723 - val_mean_absolute_error: 0.1770 - val_mean_squared_error: 0.0723
Epoch 784/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0604 - mean_absolute_error: 0.1684 - mean_squared_error: 0.0604 - val_loss: 0.0823 - val_mean_absolute_error: 0.1916 - val_mean_squared_error: 0.0823
Epoch 785/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0755 - mean_absolute_error: 0.1954 - mean_squared_error: 0.0755 - val_loss: 0.0602 - val_mean_absolute_error: 0.1620 - val_mean_squared_error: 0.0602
Epoch 786/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.0730 - mean_absolute_error: 0.1882 - mean_squared_error: 0.0730 - val_loss: 0.0662 - val_mean_absolute_error: 0.1799 - val_mean_squared_error: 0.0662
Epoch 787/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0613 - mean_absolute_error: 0.1734 - mean_squared_error: 0.0613 - val_loss: 0.0605 - val_mean_absolute_error: 0.1616 - val_mean_squared_error: 0.0605
Epoch 788/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0758 - mean_absolute_error: 0.1867 - mean_squared_error: 0.0758 - val_loss: 0.1928 - val_mean_absolute_error: 0.3120 - val_mean_squared_error: 0.1928
Epoch 789/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0915 - mean_absolute_error: 0.2015 - mean_squared_error: 0.0915 - val_loss: 0.0735 - val_mean_absolute_error: 0.1757 - val_mean_squared_error: 0.0735
Epoch 790/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0658 - mean_absolute_error: 0.1755 - mean_squared_error: 0.0658 - val_loss: 0.0697 - val_mean_absolute_error: 0.1834 - val_mean_squared_error: 0.0697
Epoch 791/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0687 - mean_absolute_error: 0.1851 - mean_squared_error: 0.0687 - val_loss: 0.0728 - val_mean_absolute_error: 0.1797 - val_mean_squared_error: 0.0728
Epoch 792/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0677 - mean_absolute_error: 0.1798 - mean_squared_error: 0.0677 - val_loss: 0.0612 - val_mean_absolute_error: 0.1661 - val_mean_squared_error: 0.0612
Epoch 793/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0611 - mean_absolute_error: 0.1718 - mean_squared_error: 0.0611 - val_loss: 0.0623 - val_mean_absolute_error: 0.1689 - val_mean_squared_error: 0.0623
Epoch 794/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1048 - mean_absolute_error: 0.2171 - mean_squared_error: 0.1048 - val_loss: 0.0703 - val_mean_absolute_error: 0.1783 - val_mean_squared_error: 0.0703
Epoch 795/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0618 - mean_absolute_error: 0.1691 - mean_squared_error: 0.0618 - val_loss: 0.0730 - val_mean_absolute_error: 0.1811 - val_mean_squared_error: 0.0730
Epoch 796/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0599 - mean_absolute_error: 0.1683 - mean_squared_error: 0.0599 - val_loss: 0.0727 - val_mean_absolute_error: 0.1869 - val_mean_squared_error: 0.0727
Epoch 797/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0684 - mean_absolute_error: 0.1832 - mean_squared_error: 0.0684 - val_loss: 0.0792 - val_mean_absolute_error: 0.1958 - val_mean_squared_error: 0.0792
Epoch 798/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0653 - mean_absolute_error: 0.1773 - mean_squared_error: 0.0653 - val_loss: 0.0712 - val_mean_absolute_error: 0.1866 - val_mean_squared_error: 0.0712
Epoch 799/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0663 - mean_absolute_error: 0.1772 - mean_squared_error: 0.0663 - val_loss: 0.0687 - val_mean_absolute_error: 0.1756 - val_mean_squared_error: 0.0687
Epoch 800/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0630 - mean_absolute_error: 0.1781 - mean_squared_error: 0.0630 - val_loss: 0.0881 - val_mean_absolute_error: 0.2023 - val_mean_squared_error: 0.0881
Epoch 801/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0878 - mean_absolute_error: 0.2070 - mean_squared_error: 0.0878 - val_loss: 0.0815 - val_mean_absolute_error: 0.1932 - val_mean_squared_error: 0.0815
Epoch 802/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0661 - mean_absolute_error: 0.1800 - mean_squared_error: 0.0661 - val_loss: 0.0654 - val_mean_absolute_error: 0.1779 - val_mean_squared_error: 0.0654
Epoch 803/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0626 - mean_absolute_error: 0.1765 - mean_squared_error: 0.0626 - val_loss: 0.0663 - val_mean_absolute_error: 0.1818 - val_mean_squared_error: 0.0663
Epoch 804/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0747 - mean_absolute_error: 0.1935 - mean_squared_error: 0.0747 - val_loss: 0.1026 - val_mean_absolute_error: 0.2198 - val_mean_squared_error: 0.1026
Epoch 805/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0854 - mean_absolute_error: 0.2006 - mean_squared_error: 0.0854 - val_loss: 0.0763 - val_mean_absolute_error: 0.1869 - val_mean_squared_error: 0.0763
Epoch 806/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0677 - mean_absolute_error: 0.1821 - mean_squared_error: 0.0677 - val_loss: 0.0678 - val_mean_absolute_error: 0.1794 - val_mean_squared_error: 0.0678
Epoch 807/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0633 - mean_absolute_error: 0.1746 - mean_squared_error: 0.0633 - val_loss: 0.0979 - val_mean_absolute_error: 0.2210 - val_mean_squared_error: 0.0979
Epoch 808/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0635 - mean_absolute_error: 0.1730 - mean_squared_error: 0.0635 - val_loss: 0.0613 - val_mean_absolute_error: 0.1679 - val_mean_squared_error: 0.0613
Epoch 809/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0866 - mean_absolute_error: 0.2002 - mean_squared_error: 0.0866 - val_loss: 0.0638 - val_mean_absolute_error: 0.1668 - val_mean_squared_error: 0.0638
Epoch 810/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0850 - mean_absolute_error: 0.1987 - mean_squared_error: 0.0850 - val_loss: 0.1445 - val_mean_absolute_error: 0.2618 - val_mean_squared_error: 0.1445
Epoch 811/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1002 - mean_absolute_error: 0.2202 - mean_squared_error: 0.1002 - val_loss: 0.0882 - val_mean_absolute_error: 0.2081 - val_mean_squared_error: 0.0882
Epoch 812/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0666 - mean_absolute_error: 0.1833 - mean_squared_error: 0.0666 - val_loss: 0.0657 - val_mean_absolute_error: 0.1799 - val_mean_squared_error: 0.0657
Epoch 813/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0687 - mean_absolute_error: 0.1870 - mean_squared_error: 0.0687 - val_loss: 0.0731 - val_mean_absolute_error: 0.1784 - val_mean_squared_error: 0.0731
Epoch 814/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1045 - mean_absolute_error: 0.2271 - mean_squared_error: 0.1045 - val_loss: 0.0801 - val_mean_absolute_error: 0.1968 - val_mean_squared_error: 0.0801
Epoch 815/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0788 - mean_absolute_error: 0.1972 - mean_squared_error: 0.0788 - val_loss: 0.0919 - val_mean_absolute_error: 0.2204 - val_mean_squared_error: 0.0919
Epoch 816/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0677 - mean_absolute_error: 0.1943 - mean_squared_error: 0.0677 - val_loss: 0.0685 - val_mean_absolute_error: 0.1915 - val_mean_squared_error: 0.0685
Epoch 817/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0629 - mean_absolute_error: 0.1813 - mean_squared_error: 0.0629 - val_loss: 0.0593 - val_mean_absolute_error: 0.1663 - val_mean_squared_error: 0.0593
Epoch 818/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.0852 - mean_absolute_error: 0.1993 - mean_squared_error: 0.0852 - val_loss: 0.0614 - val_mean_absolute_error: 0.1651 - val_mean_squared_error: 0.0614
Epoch 819/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0634 - mean_absolute_error: 0.1692 - mean_squared_error: 0.0634 - val_loss: 0.0714 - val_mean_absolute_error: 0.1817 - val_mean_squared_error: 0.0714
Epoch 820/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.0821 - mean_absolute_error: 0.1993 - mean_squared_error: 0.0821 - val_loss: 0.0682 - val_mean_absolute_error: 0.1837 - val_mean_squared_error: 0.0682
Epoch 821/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0838 - mean_absolute_error: 0.2049 - mean_squared_error: 0.0838 - val_loss: 0.0901 - val_mean_absolute_error: 0.2191 - val_mean_squared_error: 0.0901
Epoch 822/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0966 - mean_absolute_error: 0.2324 - mean_squared_error: 0.0966 - val_loss: 0.0798 - val_mean_absolute_error: 0.2124 - val_mean_squared_error: 0.0798
Epoch 823/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0644 - mean_absolute_error: 0.1842 - mean_squared_error: 0.0644 - val_loss: 0.0678 - val_mean_absolute_error: 0.1839 - val_mean_squared_error: 0.0678
Epoch 824/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0886 - mean_absolute_error: 0.2091 - mean_squared_error: 0.0886 - val_loss: 0.0858 - val_mean_absolute_error: 0.1976 - val_mean_squared_error: 0.0858
Epoch 825/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0633 - mean_absolute_error: 0.1720 - mean_squared_error: 0.0633 - val_loss: 0.0603 - val_mean_absolute_error: 0.1602 - val_mean_squared_error: 0.0603
Epoch 826/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0649 - mean_absolute_error: 0.1731 - mean_squared_error: 0.0649 - val_loss: 0.1308 - val_mean_absolute_error: 0.2359 - val_mean_squared_error: 0.1308
Epoch 827/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0757 - mean_absolute_error: 0.1943 - mean_squared_error: 0.0757 - val_loss: 0.0605 - val_mean_absolute_error: 0.1618 - val_mean_squared_error: 0.0605
Epoch 828/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0649 - mean_absolute_error: 0.1767 - mean_squared_error: 0.0649 - val_loss: 0.0566 - val_mean_absolute_error: 0.1560 - val_mean_squared_error: 0.0566
Epoch 829/1000
8000/8000 [==============================] - 1s 142us/step - loss: 0.0553 - mean_absolute_error: 0.1633 - mean_squared_error: 0.0553 - val_loss: 0.1040 - val_mean_absolute_error: 0.2201 - val_mean_squared_error: 0.1040
Epoch 830/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1193 - mean_absolute_error: 0.2393 - mean_squared_error: 0.1193 - val_loss: 0.0761 - val_mean_absolute_error: 0.1899 - val_mean_squared_error: 0.0761
Epoch 831/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0665 - mean_absolute_error: 0.1825 - mean_squared_error: 0.0665 - val_loss: 0.0766 - val_mean_absolute_error: 0.2008 - val_mean_squared_error: 0.0766
Epoch 832/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0591 - mean_absolute_error: 0.1747 - mean_squared_error: 0.0591 - val_loss: 0.0601 - val_mean_absolute_error: 0.1630 - val_mean_squared_error: 0.0601
Epoch 833/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1171 - mean_absolute_error: 0.2506 - mean_squared_error: 0.1171 - val_loss: 0.0821 - val_mean_absolute_error: 0.1899 - val_mean_squared_error: 0.0821
Epoch 834/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0690 - mean_absolute_error: 0.1905 - mean_squared_error: 0.0690 - val_loss: 0.0616 - val_mean_absolute_error: 0.1687 - val_mean_squared_error: 0.0616
Epoch 835/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0794 - mean_absolute_error: 0.1943 - mean_squared_error: 0.0794 - val_loss: 0.0774 - val_mean_absolute_error: 0.1936 - val_mean_squared_error: 0.0774
Epoch 836/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0740 - mean_absolute_error: 0.1888 - mean_squared_error: 0.0740 - val_loss: 0.0917 - val_mean_absolute_error: 0.2051 - val_mean_squared_error: 0.0917
Epoch 837/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0622 - mean_absolute_error: 0.1728 - mean_squared_error: 0.0622 - val_loss: 0.0605 - val_mean_absolute_error: 0.1667 - val_mean_squared_error: 0.0605
Epoch 838/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0633 - mean_absolute_error: 0.1739 - mean_squared_error: 0.0633 - val_loss: 0.0565 - val_mean_absolute_error: 0.1578 - val_mean_squared_error: 0.0565
Epoch 839/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.0804 - mean_absolute_error: 0.1923 - mean_squared_error: 0.0804 - val_loss: 0.0774 - val_mean_absolute_error: 0.1908 - val_mean_squared_error: 0.0774
Epoch 840/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0620 - mean_absolute_error: 0.1738 - mean_squared_error: 0.0620 - val_loss: 0.0600 - val_mean_absolute_error: 0.1664 - val_mean_squared_error: 0.0600
Epoch 841/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0568 - mean_absolute_error: 0.1633 - mean_squared_error: 0.0568 - val_loss: 0.0568 - val_mean_absolute_error: 0.1574 - val_mean_squared_error: 0.0568
Epoch 842/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0864 - mean_absolute_error: 0.1983 - mean_squared_error: 0.0864 - val_loss: 0.1292 - val_mean_absolute_error: 0.2647 - val_mean_squared_error: 0.1292
Epoch 843/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0704 - mean_absolute_error: 0.1898 - mean_squared_error: 0.0704 - val_loss: 0.0606 - val_mean_absolute_error: 0.1649 - val_mean_squared_error: 0.0606
Epoch 844/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0597 - mean_absolute_error: 0.1688 - mean_squared_error: 0.0597 - val_loss: 0.0582 - val_mean_absolute_error: 0.1621 - val_mean_squared_error: 0.0582
Epoch 845/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0800 - mean_absolute_error: 0.1936 - mean_squared_error: 0.0800 - val_loss: 0.0686 - val_mean_absolute_error: 0.1855 - val_mean_squared_error: 0.0686
Epoch 846/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0659 - mean_absolute_error: 0.1819 - mean_squared_error: 0.0659 - val_loss: 0.0562 - val_mean_absolute_error: 0.1556 - val_mean_squared_error: 0.0562
Epoch 847/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.0630 - mean_absolute_error: 0.1762 - mean_squared_error: 0.0630 - val_loss: 0.1467 - val_mean_absolute_error: 0.2682 - val_mean_squared_error: 0.1467
Epoch 848/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1121 - mean_absolute_error: 0.2325 - mean_squared_error: 0.1121 - val_loss: 0.1027 - val_mean_absolute_error: 0.2160 - val_mean_squared_error: 0.1027
Epoch 849/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0650 - mean_absolute_error: 0.1754 - mean_squared_error: 0.0650 - val_loss: 0.0729 - val_mean_absolute_error: 0.1873 - val_mean_squared_error: 0.0729
Epoch 850/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0613 - mean_absolute_error: 0.1747 - mean_squared_error: 0.0613 - val_loss: 0.0733 - val_mean_absolute_error: 0.1824 - val_mean_squared_error: 0.0733
Epoch 851/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0710 - mean_absolute_error: 0.1884 - mean_squared_error: 0.0710 - val_loss: 0.1049 - val_mean_absolute_error: 0.2232 - val_mean_squared_error: 0.1049
Epoch 852/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0920 - mean_absolute_error: 0.2093 - mean_squared_error: 0.0920 - val_loss: 0.0784 - val_mean_absolute_error: 0.2019 - val_mean_squared_error: 0.0784
Epoch 853/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0636 - mean_absolute_error: 0.1738 - mean_squared_error: 0.0636 - val_loss: 0.0717 - val_mean_absolute_error: 0.1773 - val_mean_squared_error: 0.0717
Epoch 854/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0533 - mean_absolute_error: 0.1555 - mean_squared_error: 0.0533 - val_loss: 0.0624 - val_mean_absolute_error: 0.1662 - val_mean_squared_error: 0.0624
Epoch 855/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0635 - mean_absolute_error: 0.1728 - mean_squared_error: 0.0635 - val_loss: 0.0587 - val_mean_absolute_error: 0.1660 - val_mean_squared_error: 0.0587
Epoch 856/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0564 - mean_absolute_error: 0.1653 - mean_squared_error: 0.0564 - val_loss: 0.0780 - val_mean_absolute_error: 0.1909 - val_mean_squared_error: 0.0780
Epoch 857/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1260 - mean_absolute_error: 0.2346 - mean_squared_error: 0.1260 - val_loss: 0.0701 - val_mean_absolute_error: 0.1818 - val_mean_squared_error: 0.0701
Epoch 858/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0736 - mean_absolute_error: 0.1846 - mean_squared_error: 0.0736 - val_loss: 0.0644 - val_mean_absolute_error: 0.1659 - val_mean_squared_error: 0.0644
Epoch 859/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0561 - mean_absolute_error: 0.1609 - mean_squared_error: 0.0561 - val_loss: 0.0637 - val_mean_absolute_error: 0.1685 - val_mean_squared_error: 0.0637
Epoch 860/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0586 - mean_absolute_error: 0.1697 - mean_squared_error: 0.0586 - val_loss: 0.0580 - val_mean_absolute_error: 0.1628 - val_mean_squared_error: 0.0580
Epoch 861/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0550 - mean_absolute_error: 0.1632 - mean_squared_error: 0.0550 - val_loss: 0.0753 - val_mean_absolute_error: 0.1889 - val_mean_squared_error: 0.0753
Epoch 862/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0863 - mean_absolute_error: 0.2015 - mean_squared_error: 0.0863 - val_loss: 0.0945 - val_mean_absolute_error: 0.2007 - val_mean_squared_error: 0.0945
Epoch 863/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0752 - mean_absolute_error: 0.1914 - mean_squared_error: 0.0752 - val_loss: 0.0723 - val_mean_absolute_error: 0.1924 - val_mean_squared_error: 0.0723
Epoch 864/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0548 - mean_absolute_error: 0.1658 - mean_squared_error: 0.0548 - val_loss: 0.0565 - val_mean_absolute_error: 0.1593 - val_mean_squared_error: 0.0565
Epoch 865/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0639 - mean_absolute_error: 0.1724 - mean_squared_error: 0.0639 - val_loss: 0.0591 - val_mean_absolute_error: 0.1662 - val_mean_squared_error: 0.0591
Epoch 866/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0629 - mean_absolute_error: 0.1709 - mean_squared_error: 0.0629 - val_loss: 0.0552 - val_mean_absolute_error: 0.1539 - val_mean_squared_error: 0.0552
Epoch 867/1000
8000/8000 [==============================] - 1s 148us/step - loss: 0.0670 - mean_absolute_error: 0.1752 - mean_squared_error: 0.0670 - val_loss: 0.0607 - val_mean_absolute_error: 0.1628 - val_mean_squared_error: 0.0607
Epoch 868/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0609 - mean_absolute_error: 0.1676 - mean_squared_error: 0.0609 - val_loss: 0.1024 - val_mean_absolute_error: 0.2181 - val_mean_squared_error: 0.1024
Epoch 869/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0872 - mean_absolute_error: 0.2089 - mean_squared_error: 0.0872 - val_loss: 0.0808 - val_mean_absolute_error: 0.1959 - val_mean_squared_error: 0.0808
Epoch 870/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0630 - mean_absolute_error: 0.1751 - mean_squared_error: 0.0630 - val_loss: 0.0690 - val_mean_absolute_error: 0.1800 - val_mean_squared_error: 0.0690
Epoch 871/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0706 - mean_absolute_error: 0.1867 - mean_squared_error: 0.0706 - val_loss: 0.1123 - val_mean_absolute_error: 0.2391 - val_mean_squared_error: 0.1123
Epoch 872/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0763 - mean_absolute_error: 0.1899 - mean_squared_error: 0.0763 - val_loss: 0.0628 - val_mean_absolute_error: 0.1778 - val_mean_squared_error: 0.0628
Epoch 873/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0831 - mean_absolute_error: 0.1977 - mean_squared_error: 0.0831 - val_loss: 0.0819 - val_mean_absolute_error: 0.1911 - val_mean_squared_error: 0.0819
Epoch 874/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0573 - mean_absolute_error: 0.1645 - mean_squared_error: 0.0573 - val_loss: 0.0590 - val_mean_absolute_error: 0.1664 - val_mean_squared_error: 0.0590
Epoch 875/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0634 - mean_absolute_error: 0.1770 - mean_squared_error: 0.0634 - val_loss: 0.0613 - val_mean_absolute_error: 0.1705 - val_mean_squared_error: 0.0613
Epoch 876/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0612 - mean_absolute_error: 0.1737 - mean_squared_error: 0.0612 - val_loss: 0.0781 - val_mean_absolute_error: 0.2005 - val_mean_squared_error: 0.0781
Epoch 877/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0735 - mean_absolute_error: 0.1870 - mean_squared_error: 0.0735 - val_loss: 0.0691 - val_mean_absolute_error: 0.1906 - val_mean_squared_error: 0.0691
Epoch 878/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0656 - mean_absolute_error: 0.1822 - mean_squared_error: 0.0656 - val_loss: 0.0830 - val_mean_absolute_error: 0.1992 - val_mean_squared_error: 0.0830
Epoch 879/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0643 - mean_absolute_error: 0.1743 - mean_squared_error: 0.0643 - val_loss: 0.0923 - val_mean_absolute_error: 0.2149 - val_mean_squared_error: 0.0923
Epoch 880/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0834 - mean_absolute_error: 0.2138 - mean_squared_error: 0.0834 - val_loss: 0.0926 - val_mean_absolute_error: 0.2081 - val_mean_squared_error: 0.0926
Epoch 881/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0748 - mean_absolute_error: 0.1885 - mean_squared_error: 0.0748 - val_loss: 0.0707 - val_mean_absolute_error: 0.1808 - val_mean_squared_error: 0.0707
Epoch 882/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0599 - mean_absolute_error: 0.1734 - mean_squared_error: 0.0599 - val_loss: 0.0560 - val_mean_absolute_error: 0.1571 - val_mean_squared_error: 0.0560
Epoch 883/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0886 - mean_absolute_error: 0.2029 - mean_squared_error: 0.0886 - val_loss: 0.1061 - val_mean_absolute_error: 0.2240 - val_mean_squared_error: 0.1061
Epoch 884/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0628 - mean_absolute_error: 0.1705 - mean_squared_error: 0.0628 - val_loss: 0.0705 - val_mean_absolute_error: 0.1785 - val_mean_squared_error: 0.0705
Epoch 885/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0519 - mean_absolute_error: 0.1571 - mean_squared_error: 0.0519 - val_loss: 0.0578 - val_mean_absolute_error: 0.1605 - val_mean_squared_error: 0.0578
Epoch 886/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0986 - mean_absolute_error: 0.2149 - mean_squared_error: 0.0986 - val_loss: 0.0740 - val_mean_absolute_error: 0.1935 - val_mean_squared_error: 0.0740
Epoch 887/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0678 - mean_absolute_error: 0.1844 - mean_squared_error: 0.0678 - val_loss: 0.0797 - val_mean_absolute_error: 0.2144 - val_mean_squared_error: 0.0797
Epoch 888/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0737 - mean_absolute_error: 0.2006 - mean_squared_error: 0.0737 - val_loss: 0.0642 - val_mean_absolute_error: 0.1771 - val_mean_squared_error: 0.0642
Epoch 889/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0747 - mean_absolute_error: 0.1953 - mean_squared_error: 0.0747 - val_loss: 0.0567 - val_mean_absolute_error: 0.1581 - val_mean_squared_error: 0.0567
Epoch 890/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0759 - mean_absolute_error: 0.1938 - mean_squared_error: 0.0759 - val_loss: 0.0664 - val_mean_absolute_error: 0.1755 - val_mean_squared_error: 0.0664
Epoch 891/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0612 - mean_absolute_error: 0.1739 - mean_squared_error: 0.0612 - val_loss: 0.1116 - val_mean_absolute_error: 0.2635 - val_mean_squared_error: 0.1116
Epoch 892/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0854 - mean_absolute_error: 0.2203 - mean_squared_error: 0.0854 - val_loss: 0.1427 - val_mean_absolute_error: 0.2513 - val_mean_squared_error: 0.1427
Epoch 893/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0976 - mean_absolute_error: 0.2264 - mean_squared_error: 0.0976 - val_loss: 0.0908 - val_mean_absolute_error: 0.2089 - val_mean_squared_error: 0.0908
Epoch 894/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0686 - mean_absolute_error: 0.1881 - mean_squared_error: 0.0686 - val_loss: 0.0678 - val_mean_absolute_error: 0.1931 - val_mean_squared_error: 0.0678
Epoch 895/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0739 - mean_absolute_error: 0.1939 - mean_squared_error: 0.0739 - val_loss: 0.1066 - val_mean_absolute_error: 0.2435 - val_mean_squared_error: 0.1066
Epoch 896/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0765 - mean_absolute_error: 0.1984 - mean_squared_error: 0.0765 - val_loss: 0.1140 - val_mean_absolute_error: 0.2447 - val_mean_squared_error: 0.1140
Epoch 897/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0675 - mean_absolute_error: 0.1858 - mean_squared_error: 0.0675 - val_loss: 0.0753 - val_mean_absolute_error: 0.2014 - val_mean_squared_error: 0.0753
Epoch 898/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0586 - mean_absolute_error: 0.1722 - mean_squared_error: 0.0586 - val_loss: 0.0699 - val_mean_absolute_error: 0.1871 - val_mean_squared_error: 0.0699
Epoch 899/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0729 - mean_absolute_error: 0.1880 - mean_squared_error: 0.0729 - val_loss: 0.0598 - val_mean_absolute_error: 0.1672 - val_mean_squared_error: 0.0598
Epoch 900/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0547 - mean_absolute_error: 0.1616 - mean_squared_error: 0.0547 - val_loss: 0.1049 - val_mean_absolute_error: 0.2191 - val_mean_squared_error: 0.1049
Epoch 901/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0808 - mean_absolute_error: 0.1956 - mean_squared_error: 0.0808 - val_loss: 0.0572 - val_mean_absolute_error: 0.1611 - val_mean_squared_error: 0.0572
Epoch 902/1000
8000/8000 [==============================] - 1s 134us/step - loss: 0.0535 - mean_absolute_error: 0.1609 - mean_squared_error: 0.0535 - val_loss: 0.0580 - val_mean_absolute_error: 0.1591 - val_mean_squared_error: 0.0580
Epoch 903/1000
8000/8000 [==============================] - 1s 135us/step - loss: 0.0535 - mean_absolute_error: 0.1589 - mean_squared_error: 0.0535 - val_loss: 0.0592 - val_mean_absolute_error: 0.1599 - val_mean_squared_error: 0.0592
Epoch 904/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0490 - mean_absolute_error: 0.1504 - mean_squared_error: 0.0490 - val_loss: 0.0682 - val_mean_absolute_error: 0.1768 - val_mean_squared_error: 0.0682
Epoch 905/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1215 - mean_absolute_error: 0.2357 - mean_squared_error: 0.1215 - val_loss: 0.0820 - val_mean_absolute_error: 0.2016 - val_mean_squared_error: 0.0820
Epoch 906/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0665 - mean_absolute_error: 0.1814 - mean_squared_error: 0.0665 - val_loss: 0.0649 - val_mean_absolute_error: 0.1678 - val_mean_squared_error: 0.0649
Epoch 907/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0737 - mean_absolute_error: 0.1925 - mean_squared_error: 0.0737 - val_loss: 0.0600 - val_mean_absolute_error: 0.1656 - val_mean_squared_error: 0.0600
Epoch 908/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0634 - mean_absolute_error: 0.1779 - mean_squared_error: 0.0634 - val_loss: 0.0578 - val_mean_absolute_error: 0.1644 - val_mean_squared_error: 0.0578
Epoch 909/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0567 - mean_absolute_error: 0.1676 - mean_squared_error: 0.0567 - val_loss: 0.0548 - val_mean_absolute_error: 0.1526 - val_mean_squared_error: 0.0548
Epoch 910/1000
8000/8000 [==============================] - 1s 145us/step - loss: 0.0481 - mean_absolute_error: 0.1487 - mean_squared_error: 0.0481 - val_loss: 0.0550 - val_mean_absolute_error: 0.1570 - val_mean_squared_error: 0.0550
Epoch 911/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.1244 - mean_absolute_error: 0.2380 - mean_squared_error: 0.1244 - val_loss: 0.0787 - val_mean_absolute_error: 0.2000 - val_mean_squared_error: 0.0787
Epoch 912/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0612 - mean_absolute_error: 0.1709 - mean_squared_error: 0.0612 - val_loss: 0.0560 - val_mean_absolute_error: 0.1522 - val_mean_squared_error: 0.0560
Epoch 913/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0561 - mean_absolute_error: 0.1616 - mean_squared_error: 0.0561 - val_loss: 0.0748 - val_mean_absolute_error: 0.1908 - val_mean_squared_error: 0.0748
Epoch 914/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0834 - mean_absolute_error: 0.2065 - mean_squared_error: 0.0834 - val_loss: 0.0731 - val_mean_absolute_error: 0.1806 - val_mean_squared_error: 0.0731
Epoch 915/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0622 - mean_absolute_error: 0.1821 - mean_squared_error: 0.0622 - val_loss: 0.0729 - val_mean_absolute_error: 0.1900 - val_mean_squared_error: 0.0729
Epoch 916/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0678 - mean_absolute_error: 0.1902 - mean_squared_error: 0.0678 - val_loss: 0.0838 - val_mean_absolute_error: 0.2096 - val_mean_squared_error: 0.0838
Epoch 917/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.0689 - mean_absolute_error: 0.1821 - mean_squared_error: 0.0689 - val_loss: 0.0530 - val_mean_absolute_error: 0.1512 - val_mean_squared_error: 0.0530
Epoch 918/1000
8000/8000 [==============================] - 1s 143us/step - loss: 0.0570 - mean_absolute_error: 0.1616 - mean_squared_error: 0.0570 - val_loss: 0.0639 - val_mean_absolute_error: 0.1715 - val_mean_squared_error: 0.0639
Epoch 919/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0519 - mean_absolute_error: 0.1588 - mean_squared_error: 0.0519 - val_loss: 0.0570 - val_mean_absolute_error: 0.1578 - val_mean_squared_error: 0.0570
Epoch 920/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.1982 - mean_absolute_error: 0.2756 - mean_squared_error: 0.1982 - val_loss: 0.2030 - val_mean_absolute_error: 0.3175 - val_mean_squared_error: 0.2030
Epoch 921/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0998 - mean_absolute_error: 0.2098 - mean_squared_error: 0.0998 - val_loss: 0.0860 - val_mean_absolute_error: 0.1905 - val_mean_squared_error: 0.0860
Epoch 922/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0673 - mean_absolute_error: 0.1745 - mean_squared_error: 0.0673 - val_loss: 0.0806 - val_mean_absolute_error: 0.2123 - val_mean_squared_error: 0.0806
Epoch 923/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0633 - mean_absolute_error: 0.1804 - mean_squared_error: 0.0633 - val_loss: 0.0647 - val_mean_absolute_error: 0.1769 - val_mean_squared_error: 0.0647
Epoch 924/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0599 - mean_absolute_error: 0.1749 - mean_squared_error: 0.0599 - val_loss: 0.0941 - val_mean_absolute_error: 0.2292 - val_mean_squared_error: 0.0941
Epoch 925/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0594 - mean_absolute_error: 0.1716 - mean_squared_error: 0.0594 - val_loss: 0.0615 - val_mean_absolute_error: 0.1662 - val_mean_squared_error: 0.0615
Epoch 926/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0748 - mean_absolute_error: 0.1982 - mean_squared_error: 0.0748 - val_loss: 0.1452 - val_mean_absolute_error: 0.2885 - val_mean_squared_error: 0.1452
Epoch 927/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0934 - mean_absolute_error: 0.2305 - mean_squared_error: 0.0934 - val_loss: 0.0806 - val_mean_absolute_error: 0.1957 - val_mean_squared_error: 0.0806
Epoch 928/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0674 - mean_absolute_error: 0.1942 - mean_squared_error: 0.0674 - val_loss: 0.0760 - val_mean_absolute_error: 0.2153 - val_mean_squared_error: 0.0760
Epoch 929/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0757 - mean_absolute_error: 0.1998 - mean_squared_error: 0.0757 - val_loss: 0.0629 - val_mean_absolute_error: 0.1769 - val_mean_squared_error: 0.0629
Epoch 930/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0875 - mean_absolute_error: 0.2050 - mean_squared_error: 0.0875 - val_loss: 0.0554 - val_mean_absolute_error: 0.1527 - val_mean_squared_error: 0.0554
Epoch 931/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0538 - mean_absolute_error: 0.1567 - mean_squared_error: 0.0538 - val_loss: 0.0694 - val_mean_absolute_error: 0.1842 - val_mean_squared_error: 0.0694
Epoch 932/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0549 - mean_absolute_error: 0.1674 - mean_squared_error: 0.0549 - val_loss: 0.0648 - val_mean_absolute_error: 0.1865 - val_mean_squared_error: 0.0648
Epoch 933/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0655 - mean_absolute_error: 0.1792 - mean_squared_error: 0.0655 - val_loss: 0.1031 - val_mean_absolute_error: 0.2130 - val_mean_squared_error: 0.1031
Epoch 934/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0663 - mean_absolute_error: 0.1805 - mean_squared_error: 0.0663 - val_loss: 0.0705 - val_mean_absolute_error: 0.1846 - val_mean_squared_error: 0.0705
Epoch 935/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0560 - mean_absolute_error: 0.1676 - mean_squared_error: 0.0560 - val_loss: 0.0900 - val_mean_absolute_error: 0.2023 - val_mean_squared_error: 0.0900
Epoch 936/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0876 - mean_absolute_error: 0.2016 - mean_squared_error: 0.0876 - val_loss: 0.1085 - val_mean_absolute_error: 0.2306 - val_mean_squared_error: 0.1085
Epoch 937/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0644 - mean_absolute_error: 0.1768 - mean_squared_error: 0.0644 - val_loss: 0.0995 - val_mean_absolute_error: 0.2188 - val_mean_squared_error: 0.0995
Epoch 938/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0927 - mean_absolute_error: 0.2264 - mean_squared_error: 0.0927 - val_loss: 0.1146 - val_mean_absolute_error: 0.2383 - val_mean_squared_error: 0.1146
Epoch 939/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0725 - mean_absolute_error: 0.1935 - mean_squared_error: 0.0725 - val_loss: 0.0691 - val_mean_absolute_error: 0.1940 - val_mean_squared_error: 0.0691
Epoch 940/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0631 - mean_absolute_error: 0.1808 - mean_squared_error: 0.0631 - val_loss: 0.1191 - val_mean_absolute_error: 0.2544 - val_mean_squared_error: 0.1191
Epoch 941/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0628 - mean_absolute_error: 0.1743 - mean_squared_error: 0.0628 - val_loss: 0.0601 - val_mean_absolute_error: 0.1649 - val_mean_squared_error: 0.0601
Epoch 942/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0652 - mean_absolute_error: 0.1772 - mean_squared_error: 0.0652 - val_loss: 0.0734 - val_mean_absolute_error: 0.1892 - val_mean_squared_error: 0.0734
Epoch 943/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0586 - mean_absolute_error: 0.1688 - mean_squared_error: 0.0586 - val_loss: 0.0665 - val_mean_absolute_error: 0.1776 - val_mean_squared_error: 0.0665
Epoch 944/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0673 - mean_absolute_error: 0.1804 - mean_squared_error: 0.0673 - val_loss: 0.0669 - val_mean_absolute_error: 0.1745 - val_mean_squared_error: 0.0669
Epoch 945/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0646 - mean_absolute_error: 0.1757 - mean_squared_error: 0.0646 - val_loss: 0.0603 - val_mean_absolute_error: 0.1665 - val_mean_squared_error: 0.0603
Epoch 946/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0503 - mean_absolute_error: 0.1550 - mean_squared_error: 0.0503 - val_loss: 0.0604 - val_mean_absolute_error: 0.1678 - val_mean_squared_error: 0.0604
Epoch 947/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0774 - mean_absolute_error: 0.1922 - mean_squared_error: 0.0774 - val_loss: 0.1085 - val_mean_absolute_error: 0.2241 - val_mean_squared_error: 0.1085
Epoch 948/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0714 - mean_absolute_error: 0.1843 - mean_squared_error: 0.0714 - val_loss: 0.0658 - val_mean_absolute_error: 0.1771 - val_mean_squared_error: 0.0658
Epoch 949/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0686 - mean_absolute_error: 0.1802 - mean_squared_error: 0.0686 - val_loss: 0.0752 - val_mean_absolute_error: 0.1811 - val_mean_squared_error: 0.0752
Epoch 950/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0569 - mean_absolute_error: 0.1643 - mean_squared_error: 0.0569 - val_loss: 0.0656 - val_mean_absolute_error: 0.1754 - val_mean_squared_error: 0.0656
Epoch 951/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0500 - mean_absolute_error: 0.1536 - mean_squared_error: 0.0500 - val_loss: 0.0552 - val_mean_absolute_error: 0.1569 - val_mean_squared_error: 0.0552
Epoch 952/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0796 - mean_absolute_error: 0.1895 - mean_squared_error: 0.0796 - val_loss: 0.0548 - val_mean_absolute_error: 0.1530 - val_mean_squared_error: 0.0548
Epoch 953/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0516 - mean_absolute_error: 0.1545 - mean_squared_error: 0.0516 - val_loss: 0.0584 - val_mean_absolute_error: 0.1588 - val_mean_squared_error: 0.0584
Epoch 954/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0522 - mean_absolute_error: 0.1550 - mean_squared_error: 0.0522 - val_loss: 0.0862 - val_mean_absolute_error: 0.1945 - val_mean_squared_error: 0.0862
Epoch 955/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0970 - mean_absolute_error: 0.2118 - mean_squared_error: 0.0970 - val_loss: 0.0556 - val_mean_absolute_error: 0.1566 - val_mean_squared_error: 0.0556
Epoch 956/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0493 - mean_absolute_error: 0.1512 - mean_squared_error: 0.0493 - val_loss: 0.0525 - val_mean_absolute_error: 0.1513 - val_mean_squared_error: 0.0525
Epoch 957/1000
8000/8000 [==============================] - 1s 144us/step - loss: 0.0479 - mean_absolute_error: 0.1498 - mean_squared_error: 0.0479 - val_loss: 0.0763 - val_mean_absolute_error: 0.1920 - val_mean_squared_error: 0.0763
Epoch 958/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0709 - mean_absolute_error: 0.1832 - mean_squared_error: 0.0709 - val_loss: 0.0723 - val_mean_absolute_error: 0.1789 - val_mean_squared_error: 0.0723
Epoch 959/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0605 - mean_absolute_error: 0.1692 - mean_squared_error: 0.0605 - val_loss: 0.0791 - val_mean_absolute_error: 0.1933 - val_mean_squared_error: 0.0791
Epoch 960/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0597 - mean_absolute_error: 0.1686 - mean_squared_error: 0.0597 - val_loss: 0.1090 - val_mean_absolute_error: 0.2230 - val_mean_squared_error: 0.1090
Epoch 961/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0668 - mean_absolute_error: 0.1793 - mean_squared_error: 0.0668 - val_loss: 0.0573 - val_mean_absolute_error: 0.1647 - val_mean_squared_error: 0.0573
Epoch 962/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0722 - mean_absolute_error: 0.1904 - mean_squared_error: 0.0722 - val_loss: 0.0878 - val_mean_absolute_error: 0.2195 - val_mean_squared_error: 0.0878
Epoch 963/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0622 - mean_absolute_error: 0.1780 - mean_squared_error: 0.0622 - val_loss: 0.0627 - val_mean_absolute_error: 0.1722 - val_mean_squared_error: 0.0627
Epoch 964/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0609 - mean_absolute_error: 0.1721 - mean_squared_error: 0.0609 - val_loss: 0.0566 - val_mean_absolute_error: 0.1644 - val_mean_squared_error: 0.0566
Epoch 965/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0585 - mean_absolute_error: 0.1669 - mean_squared_error: 0.0585 - val_loss: 0.0536 - val_mean_absolute_error: 0.1564 - val_mean_squared_error: 0.0536
Epoch 966/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0508 - mean_absolute_error: 0.1544 - mean_squared_error: 0.0508 - val_loss: 0.1731 - val_mean_absolute_error: 0.2927 - val_mean_squared_error: 0.1731
Epoch 967/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1145 - mean_absolute_error: 0.2333 - mean_squared_error: 0.1145 - val_loss: 0.0833 - val_mean_absolute_error: 0.1908 - val_mean_squared_error: 0.0833
Epoch 968/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0683 - mean_absolute_error: 0.1851 - mean_squared_error: 0.0683 - val_loss: 0.0777 - val_mean_absolute_error: 0.1962 - val_mean_squared_error: 0.0777
Epoch 969/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0719 - mean_absolute_error: 0.1990 - mean_squared_error: 0.0719 - val_loss: 0.0537 - val_mean_absolute_error: 0.1554 - val_mean_squared_error: 0.0537
Epoch 970/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0593 - mean_absolute_error: 0.1744 - mean_squared_error: 0.0593 - val_loss: 0.0689 - val_mean_absolute_error: 0.1911 - val_mean_squared_error: 0.0689
Epoch 971/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.1004 - mean_absolute_error: 0.2298 - mean_squared_error: 0.1004 - val_loss: 0.0540 - val_mean_absolute_error: 0.1523 - val_mean_squared_error: 0.0540
Epoch 972/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0663 - mean_absolute_error: 0.1853 - mean_squared_error: 0.0663 - val_loss: 0.0621 - val_mean_absolute_error: 0.1701 - val_mean_squared_error: 0.0621
Epoch 973/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0576 - mean_absolute_error: 0.1724 - mean_squared_error: 0.0576 - val_loss: 0.0600 - val_mean_absolute_error: 0.1648 - val_mean_squared_error: 0.0600
Epoch 974/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0546 - mean_absolute_error: 0.1608 - mean_squared_error: 0.0546 - val_loss: 0.0749 - val_mean_absolute_error: 0.1923 - val_mean_squared_error: 0.0749
Epoch 975/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.1033 - mean_absolute_error: 0.2214 - mean_squared_error: 0.1033 - val_loss: 0.0620 - val_mean_absolute_error: 0.1701 - val_mean_squared_error: 0.0620
Epoch 976/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0593 - mean_absolute_error: 0.1680 - mean_squared_error: 0.0593 - val_loss: 0.0525 - val_mean_absolute_error: 0.1482 - val_mean_squared_error: 0.0525
Epoch 977/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0578 - mean_absolute_error: 0.1650 - mean_squared_error: 0.0578 - val_loss: 0.0736 - val_mean_absolute_error: 0.1861 - val_mean_squared_error: 0.0736
Epoch 978/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0641 - mean_absolute_error: 0.1754 - mean_squared_error: 0.0641 - val_loss: 0.0598 - val_mean_absolute_error: 0.1621 - val_mean_squared_error: 0.0598
Epoch 979/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0582 - mean_absolute_error: 0.1639 - mean_squared_error: 0.0582 - val_loss: 0.0597 - val_mean_absolute_error: 0.1735 - val_mean_squared_error: 0.0597
Epoch 980/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0731 - mean_absolute_error: 0.1860 - mean_squared_error: 0.0731 - val_loss: 0.0760 - val_mean_absolute_error: 0.1975 - val_mean_squared_error: 0.0760
Epoch 981/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0583 - mean_absolute_error: 0.1664 - mean_squared_error: 0.0583 - val_loss: 0.0515 - val_mean_absolute_error: 0.1479 - val_mean_squared_error: 0.0515
Epoch 982/1000
8000/8000 [==============================] - 1s 148us/step - loss: 0.0501 - mean_absolute_error: 0.1540 - mean_squared_error: 0.0501 - val_loss: 0.0531 - val_mean_absolute_error: 0.1537 - val_mean_squared_error: 0.0531
Epoch 983/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0678 - mean_absolute_error: 0.1761 - mean_squared_error: 0.0678 - val_loss: 0.0853 - val_mean_absolute_error: 0.2048 - val_mean_squared_error: 0.0853
Epoch 984/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0880 - mean_absolute_error: 0.2071 - mean_squared_error: 0.0880 - val_loss: 0.0723 - val_mean_absolute_error: 0.1965 - val_mean_squared_error: 0.0723
Epoch 985/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0598 - mean_absolute_error: 0.1702 - mean_squared_error: 0.0598 - val_loss: 0.0577 - val_mean_absolute_error: 0.1641 - val_mean_squared_error: 0.0577
Epoch 986/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0538 - mean_absolute_error: 0.1623 - mean_squared_error: 0.0538 - val_loss: 0.0773 - val_mean_absolute_error: 0.1926 - val_mean_squared_error: 0.0773
Epoch 987/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0561 - mean_absolute_error: 0.1674 - mean_squared_error: 0.0561 - val_loss: 0.0546 - val_mean_absolute_error: 0.1541 - val_mean_squared_error: 0.0546
Epoch 988/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0697 - mean_absolute_error: 0.1777 - mean_squared_error: 0.0697 - val_loss: 0.0541 - val_mean_absolute_error: 0.1530 - val_mean_squared_error: 0.0541
Epoch 989/1000
8000/8000 [==============================] - 1s 136us/step - loss: 0.0491 - mean_absolute_error: 0.1505 - mean_squared_error: 0.0491 - val_loss: 0.0517 - val_mean_absolute_error: 0.1496 - val_mean_squared_error: 0.0517
Epoch 990/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0585 - mean_absolute_error: 0.1643 - mean_squared_error: 0.0585 - val_loss: 0.0776 - val_mean_absolute_error: 0.1960 - val_mean_squared_error: 0.0776
Epoch 991/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0762 - mean_absolute_error: 0.1877 - mean_squared_error: 0.0762 - val_loss: 0.1468 - val_mean_absolute_error: 0.2753 - val_mean_squared_error: 0.1468
Epoch 992/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0986 - mean_absolute_error: 0.2172 - mean_squared_error: 0.0986 - val_loss: 0.0773 - val_mean_absolute_error: 0.1891 - val_mean_squared_error: 0.0773
Epoch 993/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0557 - mean_absolute_error: 0.1624 - mean_squared_error: 0.0557 - val_loss: 0.0577 - val_mean_absolute_error: 0.1652 - val_mean_squared_error: 0.0577
Epoch 994/1000
8000/8000 [==============================] - 1s 137us/step - loss: 0.0548 - mean_absolute_error: 0.1633 - mean_squared_error: 0.0548 - val_loss: 0.0572 - val_mean_absolute_error: 0.1584 - val_mean_squared_error: 0.0572
Epoch 995/1000
8000/8000 [==============================] - 1s 140us/step - loss: 0.0637 - mean_absolute_error: 0.1765 - mean_squared_error: 0.0637 - val_loss: 0.0561 - val_mean_absolute_error: 0.1541 - val_mean_squared_error: 0.0561
Epoch 996/1000
8000/8000 [==============================] - 1s 141us/step - loss: 0.0689 - mean_absolute_error: 0.1777 - mean_squared_error: 0.0689 - val_loss: 0.0702 - val_mean_absolute_error: 0.1774 - val_mean_squared_error: 0.0702
Epoch 997/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0746 - mean_absolute_error: 0.1914 - mean_squared_error: 0.0746 - val_loss: 0.0586 - val_mean_absolute_error: 0.1658 - val_mean_squared_error: 0.0586
Epoch 998/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0521 - mean_absolute_error: 0.1586 - mean_squared_error: 0.0521 - val_loss: 0.0643 - val_mean_absolute_error: 0.1698 - val_mean_squared_error: 0.0643
Epoch 999/1000
8000/8000 [==============================] - 1s 138us/step - loss: 0.0760 - mean_absolute_error: 0.1958 - mean_squared_error: 0.0760 - val_loss: 0.0546 - val_mean_absolute_error: 0.1597 - val_mean_squared_error: 0.0546
Epoch 1000/1000
8000/8000 [==============================] - 1s 139us/step - loss: 0.0570 - mean_absolute_error: 0.1715 - mean_squared_error: 0.0570 - val_loss: 0.0670 - val_mean_absolute_error: 0.1753 - val_mean_squared_error: 0.0670
In [67]:
model = nn_clf()
model.load_weights(".mdl_wts.hdf5")

labels_estimated_testing_scaled = model.predict(input_test)

labels_estimated_testing = ScalerNN.inverse_transform(labels_estimated_testing_scaled)

fig, ax = plt.subplots(2, 3, figsize = (18,10))
for i in range(5):
    RMS = np.sqrt(mean_squared_error(labels_estimated_testing[:,i], labels_test[:,i]))
    RMS2 = np.sqrt(mean_squared_error(labels_estimated_testing_scaled[:,i], labels_test_scaled[:,i]))
    MAE = mean_absolute_error(labels_estimated_testing[:,i], labels_test[:,i])
    bias = np.sum(labels_estimated_testing[:,i] - labels_test[:,i])/nr_training_stars
    ax[i//3, i%3].scatter(labels_test[:,i], labels_estimated_testing[:,i], s=4, c="navy")
    ax[i//3, i%3].plot([-100,6000],[-100,6000], c="y")
    ax[i//3, i%3].set_xlim(*label_xlims[i]); ax[i//3, i%3].set_ylim(*label_xlims[i])
    ax[i//3, i%3].set_title(f"{label_names[i]} - Testing Data\nRMSE = {RMS:.4f}  |  RMSE(scaled) = {RMS2:.4f}\nMAE={MAE:.4f}    |    BIAS = {bias:.4f}")
    ax[i//3, i%3].set_xlabel("ASPCAP Input Labels"); ax[i//3, i%3].set_ylabel("Model Predicted Labels")
ax[-1,-1].axis('off')
plt.tight_layout();
In [64]:
MSE = np.array(history.history['mean_squared_error'])
val_MSE = np.array(history.history['val_mean_squared_error'])
plt.ylim(0, 0.5)
asdf = np.argsort(val_MSE)[:10]
plt.plot(MSE, c="b")
plt.plot(val_MSE, c="g")
plt.scatter(asdf, val_MSE[asdf], c="r")
Out[64]:
<matplotlib.collections.PathCollection at 0x7f452457ce10>

6.2 Considerations and Hindsight

A lot of approaches was attempted to increase the network's accuracy, with some small improvements gained after much trying and failing.

We believe the main reason the earlier predictive model outperforms the neural network is its direct access to the error data. We, therefore, attempted including the errors in the network learning set. This did not improve the results. I believe the main problem with this, using a standard dense layer, is that is allows for cross-communications between every flux-pixel and every error-pixel. In theory, the network should be able to map the correct error to the correct flux, given enough data, but with several thousand of each, I believe this seems futile in this scenario. A probably much stronger approach is, somehow, only allowing each flux-pixel to communicate with its own error-pixel, through some local layers.

We also had a lot of problems with convergence. Starting out, the loss function refused to converge, staying steadily at 1 forever. This went away when all layers were switching to linear (the last layer was of course already linear, due to presence of negative labels). Later, the network had a tendency to spike in loss and MSE by orders of magnitude at certain intervals (usually 50-200 epochs). This went away when the amsgrad option in the Adam optimizer was turned on. It is unknown why.

We also considered many methods for cleaning up bad data. Mostly by trying to reset "off" values to some default. For example, setting value < 0.2 or values > 1.2 to 1, which acts as sort of a default. This did not yield better results, probably because these offsets are also present in the confirmation set. None of the attempt produced any noticeable difference.

7. Result Comparison

Below we see a table of the RMSE and bias comparing to ASPCAP labels. We see that the second neural network implementation closes almost the entire gap between the first approach and the original model.

Table of RMSE and bias for each model

Model 1 NN 1 NN 2
Teff 74.32 (-11.0) 104.95 (-13.8 90.28 (-5.044)
Log(g) 0.1543 (-0.0093) 0.188 (-0.0261) 0.156 (-0.0187)
Fe/H 0.0537 (-0.0061) 0.0.0612 (0.0035) 0.0541 (-0.0038)
Mg/Fe 0.0516 (-0.0016) 0.0.0623 (--0.0034) 0.0510 (-0.0023)
Si/Fe 0.0558 (0.0077) 0.0650 (-0.0047) 0.0404 (-0.0012)